Nature Bites Back / The “Tick” Thing – Diseases

Two articles about tick-borne disease dangers: one from social media; one from Washington University in St. Louis. Do “ASD” Asperger people run away from environmental threats (chemicals, noise, light, crowding, disease – “sick buildings”; pathogens, parasites, etc.) due to a natural “ecological fear response”?

Oh, Lovely: The Tick That Gives People Meat Allergies Is Spreading

06/20/2017

By Megan Molteni for WIRED.

First comes the unscratchable itching, and the angry blossoming of hives. Then stomach cramping, and — for the unluckiest few — difficulty breathing, passing out, and even death. In the last decade and a half, thousands of previously protein-loving Americans have developed a dangerous allergy to meat. And they all have one thing in common: the lone star tick.

Red meat, you might be surprised to know, isn’t totally sugar-free. It contains a few protein-linked saccharides, including one called galactose-alpha-1,3-galactose, or alpha-gal, for short. More and more people are learning this the hard way, when they suddenly develop a life-threatening allergy to that pesky sugar molecule after a tick bite.

Yep, one bite from the lone star tick — which gets its name from the Texas-shaped splash of white on its back — is enough to reprogram your immune system to forever reject even the smallest nibble of perfectly crisped bacon. For years, physicians and researchers only reported the allergy in places the lone star tick calls home, namely the southeastern United States. But recently it’s started to spread.

The newest hot spots? Duluth, Minnesota; Hanover, New Hampshire; and the eastern tip of Long Island, where at least 100 cases have been reported in the last year. Scientists are racing to trace its spread, to understand whether the lone star tick is expanding into new territories, or whether other species of ticks are now causing the allergy.


The University of Virginia is deep in the heart of lone star tick country. It’s also home to a world-class allergy research division, headed up by immunologist Thomas Platts-Mills. He’d been hearing tales of the meat allergy since the ’90s — people waking up in the middle of the night after a big meal, sweating and breaking out in hives. But he didn’t give it much thought until 2004, when he heard about another group of patients all suffering from the same symptoms.

This time, it wasn’t a plate of pork chops they shared; it was a new cancer drug called cetuximab. The drug worked, but curiously, patients who lived in the southeast were 10 times as likely to report side effects of itching, swelling, and a dangerous drop in blood pressure.

Platts-Mills teamed up with cetuximab’s distributor, Bristol-Myers Squibb, and began comparing patient blood samples. He discovered that all the patients who experienced an allergic reaction had pre-existing antibodies to alpha-gal, and cetuximab was full of the stuff, thanks to the genetically modified mice from which it was derived. With that mystery solved, Platts-Mills turned to figuring out what made patients so sensitive to alpha-gal.

The best hint he had was the geographic overlap between the cetuximab patients and previously reported meat allergies. The area perfectly matched where people came down with Rocky Mountain spotted fever — a disease carried by the lone star tick. But it wasn’t until Platts-Mills and two of his lab members came down with tick-induced meat allergies of their own that they made the connection.


Over the next few years Platts-Mills and his colleague Scott Commins screened more meat allergy patients and discovered that 80 percent reported being bitten by a tick. What’s more, they showed that tick bites led to a 20-fold increase in alpha-gal antibodies. Since ethics standards prevented them from attaching ticks to randomized groups of patients, this data was the best they could do to guess how meat allergy arises. Something in the tick’s saliva hijacks humans’ immune systems, red-flagging alpha-gal, and triggering the massive release of histamines whenever red meat is consumed.

Researchers are still trying to find what that something is. Commins has since moved to the University of North Carolina, where he’s injecting mice with lone star tick extracts to try to understand which molecules are setting off the alpha-gal bomb. It’s tricky: Tick saliva is packed with tons of bioactive compounds to help the parasite feed without detection. One of them might be an alpha-gal analogue — something similar-but-different-enough in shape that it sets off the human immune system. But it could also be a microbe — like a bacteria or virus — that triggers the response. Some have even suggested that residual proteins from the ticks’ earlier blood meals could be the culprit.


Whatever it is, allergy researchers will be paying attention. Because, as far as anyone can tell, alpha-gal syndrome seems to be the only allergy that affects all people, regardless of genetic makeup. “There’s something really special about this tick,” says Jeff Wilson, an asthma, allergy, and immunology fellow in Platts-Mills’ group. Usually a mix of genes and environmental factors combine to create allergies. But when it comes to the lone star tick it doesn’t matter if you’re predisposed or not. “Just a few bites and you can render anyone really, really allergic,” he says.

In the meantime, Platts-Mills, Commins, and Wilson are busy communicating the scale of the public health problem. Every day they check local news headlines to log new cases of catastrophic hamburger aversion, and spend hours on the phone gathering the latest intel from allergy clinics and academic centers around the country. They’re building the first real red meat allergy incidence map of the U.S. — because state health departments aren’t required to report alpha-gal syndrome to the Centers for Disease Control and Prevention. And it’s still rare enough outside the southeastern US that many doctors don’t correctly diagnose it.

Wilson is trying to get blood samples from all the new outbreaks, to figure out if the patients’ antibodies correspond to the saliva of lone star ticks or a different tick species. That will tell him if the increases in the allergy are the result of changing range patterns, or if other ticks have developed the capacity to rewire human immune systems in the same way. That information would also provide further clues to the mechanism itself. As for a cure? There’s not much science has to offer on that front, besides EpiPens and veggie burgers.

___________________________________

Don’t know how accurate this list is, but it points out that ticks need moisture and woody environments to thrive. After 22 years in Wyoming, with my dogs running all over the desert shrubland, I’ve had to remove maybe 2-3 ticks.

Top 5 States for Dogs with Fleas: Arkansas, Florida, South Carolina, Alabama, Oregon
Top 5 States for Cats with Fleas: Oregon, Washington, Florida, California, Alabama
The least common? Semi-arid and desert regions of the U.S., in the Northern, Midwestern and Western states.
Bottom 5 States for Dogs with Fleas: Utah, Montana, Nevada, Arizona, South Dakota
Bottom 5 States for Cats with Fleas: Utah, Colorado, Nevada, Montana, South Dakota

____________________________________

From people who study ticks: https://source.wustl.edu/2012/02

Study extends the ‘ecology of fear’ to fear of parasites

Squirrels and raccoons will give up food to avoid ticks

By Diana Lutz, February 24, 2012 (updated January 13, 2016)

Here’s a riddle: What’s the difference between a tick and a lion? The answer used to be that a tick is a parasite and the lion is a predator. But now those definitions don’t seem as secure as they once did. A tick also hunts its prey, following vapor trails of carbon dioxide, and consumes host tissues (blood is considered a tissue), so at least in terms of its interactions with other creatures, it is like a lion — a very small, eight-legged lion.

Ecologists are increasingly finding it useful to think of parasites, such as ticks, as micro-predators and have been mining predator-prey theory for insights into parasite-host ecology. One of those insights is that predators don’t just graze at will, and prey aren’t just so many steaks in a freezer. Instead, prey make predators work for dinner by moving elsewhere, being vigilant, flocking together or taking other defensive measures.

This notion that prey are not victims but players, as strongly motivated by fear as the predators are by hunger, is called the ecology of fear.

Work at Washington University in St. Louis, just published in EcoHealth, shows that the ecology of fear, like other concepts from predator-prey theory, also extends to parasites. Raccoons and squirrels would give up food, the study demonstrated, if the area was infested with larval ticks. At some level, they are weighing the value of the abandoned food against the risk of being parasitized.

This new understanding of the interaction between ticks and host animals has implications for human health because the ticks are vectors of several newly emerging diseases. The more we know about what determines the distributions of ticks in their environment, the better prepared we will be to avoid human exposure to these diseases.

Do host animals fear ticks?

The study’s first author, Alexa Fritzsche, collaborated with Brian Allan, PhD, now an assistant professor of entomology at the University of Illinois at Urbana-Champaign.

Is there an animal left in the wild that isn’t part of a science study?

Two young raccoons visiting a feeding tray for breakfast become unwitting participants in the study. (Credit: Fritzsche)

By the time Allan finished his postdoctoral fellowship at WUSTL, he had acquired a reputation as the tick man of Tyson Research Center, the university’s biological field station. So it was only natural that when Fritzsche, then Allan’s summer research technician, was given time to do research of her own, she decided to see if the ecology of fear extends to ticks.

Fritzsche now is a doctoral candidate in the Odum School of Ecology at the University of Georgia, where she studies the role that animal behavior plays in determining the risk of parasitism.

Near St. Louis, the most prevalent tick is Amblyomma americanum, called the lone star tick because the adult female has a white splotch on her back. Its larval stage heavily parasitizes small mammals, such as gray and fox squirrels and the common raccoon. Because the ticks can weaken an animal either by exposing it to pathogens or simply by consuming vast quantities of its blood, it made sense to ask whether the host animals were aware of the ticks and able to avoid them.

“It really comes down to natural selection,” Fritzsche says. “There is a cost to being parasitized, and if you don’t develop ways to detect the parasite and avoid it, you’re not going to do well in the long term.”

What will they give up to avoid ticks?

The study was designed to take advantage of the fact that lone star tick larvae (sometimes called “seed ticks”) emerge from eggs in the leaf litter in mid- to late-summer and tick densities increase as more and more ticks emerge. Larval tick densities were measured by dragging a cloth to which “questing” ticks became attached, and counting and identifying the ticks in the laboratory.

“The tick larvae are only about the size of a poppyseed,” Fritzsche says, “but they are present in such great numbers that you can look down and see a mass of them on the ground.

“When you dragged over one of these ‘tick bombs,’” she says, “the ticks could scatter across the cloth within seconds. I walked with a loop of duct-tape around my hand and as soon as I saw a mass, I’d hit the cloth with the duct tape and they’d be stuck on the tape.”

The response of the host animals (squirrels and raccoons) to the ticks was measured by how much food they abandoned, called the giving-up density (GUD). This metric for assessing tradeoffs between foraging benefits and predation risks is well established in predator-prey ecology but has only recently been used to assess the ecology of fear in host-parasite interactions.
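
For the quantitatively inclined: a minimal sketch in Python of how a GUD comparison works, with made-up numbers rather than the study’s data. The giving-up density is simply the food left in a tray when the forager quits, so the basic test is whether mean GUD is higher at tick-infested sites.

```python
# Toy GUD comparison (illustrative values, NOT data from the EcoHealth study).
from statistics import mean, stdev

# grams of seed remaining per experimental tray
gud_low_ticks = [1.2, 0.8, 1.5, 0.9, 1.1]   # sites near ~1 larval tick / 60 m^2
gud_high_ticks = [4.8, 5.6, 4.1, 5.9, 5.2]  # heavily infested "tick bomb" sites

for label, guds in [("low tick density", gud_low_ticks),
                    ("high tick density", gud_high_ticks)]:
    print(f"{label}: mean GUD = {mean(guds):.2f} g (sd {stdev(guds):.2f})")

# Higher GUD at infested sites means the animals abandoned more food,
# i.e., they quit foraging sooner to avoid being parasitized.
```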

Run for your lives

Contrary to Fritzsche’s expectations, the animals didn’t abandon the ground-level trays as soon as the ticks began to emerge. Over the course of the study, tick numbers increased — but in a patchy fashion. Some sites had only one tick per 60 square meters; others had 667.

As densities rose, however, the animals began to abandon more seed from trays at sites with high tick densities, regardless of whether the trays were on the ground or in a tree. The result suggests that the host animals may recognize the threat of parasitism and adjust their patterns of foraging accordingly.

The Centers for Disease Control and Prevention: reported cases of infection with Ehrlichia chaffeensis, the most common of the emerging diseases carried by the lone star tick (Amblyomma americanum). Oklahoma, Missouri and Arkansas account for 35 percent of all reported E. chaffeensis infections. The incidence of ehrlichiosis has risen steadily since the disease became reportable in 2000, but thankfully the case fatality rate has declined. (Credit: CDC)

“We thought that they might abandon more seed on the ground than in the tree because ticks are confined to the ground, so we expected more of a local trade-off in foraging,” Allan says. “It turned out that the hosts were actually avoiding entire areas of high tick densities, suggesting potentially an even stronger response to the risk of parasitism than we initially hypothesized.”

Apparently people have underestimated both the ticks and their furry hosts, which, far from blundering about obliviously, are wary of threats to their health the size of the period at the end of this sentence.

Fritzsche is willing to take the ecology of fear even further – to include host responses to infections with micro-organisms as well as micro-predators.

Running a temperature helps some amphibians fight parasites such as viruses and fungi. As cold-blooded animals, they can’t raise their temperature on their own, but some amphibians will go to the highest rocks where the sun burns brightest to acquire a “behavioral fever” that helps them fight these illnesses.

“Some people are reluctant to attribute this level of ‘awareness’ to wild animals,” Allan says, “but ecologists have established quite clearly that prey will go to great lengths to avoid predation. Given the substantial cost of parasitism to wildlife, it wouldn’t be surprising if hosts actively adjust their behaviors to reduce this burden.”

After all, it isn’t that different from washing your hands.

Total Eclipse Wyoming / August 21 Map and Visitor Info

Wyoming will be a popular choice for eclipse chasers because of its good weather prospects, uncrowded highways (outside the Jackson Hole area), and good duration of totality.

The most dramatic scenery for this 2017 eclipse can be found in Wyoming. While Yellowstone National Park is just north of totality, Grand Teton National Park is squarely centered in the eclipse path and will surely be a magnet for eclipse observers.

Click for the Wyoming state page on EclipseWise.com. The site provides the most comprehensive and authoritative state pages for the 2017 eclipse. EclipseWise.com is built by Fred Espenak, retired NASA astrophysicist and the leading expert on eclipse predictions.

Mental Illness / Media and Mass Shootings

This is a long article: go to http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4318286

U.S. National Library of Medicine, National Institutes of Health

Mental Illness, Mass Shootings, and the Politics of American Firearms

Jonathan M. Metzl, MD, PhD and Kenneth T. MacLeish, PhD

Jonathan M. Metzl is with the Center for Medicine, Health, and Society and the Departments of Sociology and Psychiatry, Vanderbilt University, Nashville, TN. Kenneth T. MacLeish is with the Center for Medicine, Health, and Society and the Department of Anthropology, Vanderbilt University.

Only in the 1960s and 1970s did US society begin to link schizophrenia with violence and guns. Psychiatric journals suddenly described patients whose illness was marked by criminality and aggression. Federal Bureau of Investigation (FBI) most-wanted lists in leading newspapers described gun-toting “schizophrenic killers” on the loose,76 and Hollywood films similarly showed angry schizophrenics who rioted and attacked.77

Historical analysis14,78 suggests that this transformation resulted, not from increasingly violent actions perpetrated by “the mentally ill,” but from diagnostic frame shifts that incorporated violent behavior into official psychiatric definitions of mental illness. Before the 1960s, official psychiatric discourse defined schizophrenia as a psychological “reaction” to a splitting of the basic functions of personality. Descriptors emphasized the generally calm nature of such persons in ways that encouraged associations with poets or middle-class housewives.79 But in 1968, the second edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM)80 recast paranoid schizophrenia as a condition of “hostility,” “aggression,” and projected anger, and included text explaining that “the patient’s attitude is frequently hostile and aggressive, and his behavior tends to be consistent with his delusions.”80(p34-36)

A somewhat similar story can be told about posttraumatic stress disorder (PTSD), another illness frequently associated with gun violence.15 From the mid-19th century through World War II, military leaders and doctors assumed that combat-related stress afflicted neurotic or cowardly soldiers. In the wake of the Vietnam War, the DSM-III recast PTSD as a normal mind’s response to exceptional events. Yet even as the image of the traumatized soldier evolved from sick and cowardly to sympathetic victim, PTSD increasingly became associated with violent behavior in the public imagination, and the stereotype of the “crazy vet” emerged as a result. In the present day, even news coverage drawing attention to veterans’ suffering frequently makes its point by linking posttraumatic stress with violent crime, despite the paucity of data linking PTSD diagnosis with violence and criminality.38,81

Evolutions such as these not only imbued the mentally ill with an imagined potential for violence, but also encouraged psychiatrists and the general public to define violent acts as symptomatic of mental illness. As the following section suggests, the diagnostic evolution of schizophrenia additionally positioned psychiatric discourse as authoritative, not just on clinical “conditions” linking guns with mental illness, but on political, social, and racial ones as well.

WOW! A dangerous granting of authority to psychiatrists (and indeed to psychology and the social sciences), and further evidence that the “caring, fixing, helping” industry has taken on vast power to define individual “destinies” since the 1960s. Most Americans have no awareness that this shift in dominant authority has occurred, or of how negatively this philosophy of pan-human dysfunction has eroded the American quality of life.

_______________________________________________________

Whatever behavior is disapproved of becomes a “symptom” of mental illness. That symptom can be attached to “acting black” or any other chosen origin of behavior.

Psychiatric racism: in the 1960s-70s, Black activism was recast as a mental illness that could be controlled by the application of Haldol.

In a 1969 essay titled “The Protest Psychosis,” psychiatrists postulated that the growing racial disharmony in the US at the height of the Civil Rights Movement was a manifestation of psychotic behaviors and delusions afflicting America’s black lower class. “Paranoid delusions that one is being constantly victimized” resulted in black male anger and a misplaced desire to overthrow the establishment.

 

Gray Matter Matters / Distinctions Between Autism and Asperger’s

Two articles that point to gray matter differences in people whose jobs require mental mapping and a paper concerning gray matter distribution in Autism and Asperger’s brain types

Boston Globe Online

Do our brains pay a price for GPS?

How a useful technology interferes with our ‘mental mapping’ — and what to do about it. By Leon Neyfakh, Globe Staff, August 18, 2013

Clip: “(Veronique) Bohbot, the McGill neuroscientist, started experimenting with navigation because of an interest in the way people’s brains change as a result of learning. Bohbot developed a method for using fMRI technology to distinguish between people who tended to find their way by going through a memorized list of step-by-step directions — what she calls “stimulus response strategy” — and those who were inclined to orient themselves by conjuring a mental map of the world around them. People who just follow directions, Bohbot found, tended to have less gray matter in their hippocampus, the part of the brain responsible for encoding spatial memories.”

“People whose everyday work is deeply dependent on mental mapping can show brain development that is particularly distinctive. A famous study published in 2000 by British neuroscientist Eleanor Maguire showed that taxi drivers in London with years of experience navigating the city’s complex geography had more gray matter in the posterior hippocampus compared to people who were not taxi drivers. The study underscores that how our brain works is subject to use; the brain is plastic, and the more mental mapping we do, the stronger our cognitive navigation skills and the bigger the part of the brain that encodes them.”

More at Boston Globe…

____________________________________________________________________________________________

The comparison of gray matter volume as a plastic result of how one ‘uses’ the brain led me to a study on PubMed:

Detailed article: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3201995/ 

Can Asperger syndrome be distinguished from autism? An anatomic likelihood meta-analysis of MRI studies

Kevin K. Yu, BSc,* Charlton Cheung, PhD,* Siew E. Chua, BM BCh, and Gráinne M. McAlonan, MBBS, PhD. The autism research program in the Department of Psychiatry, University of Hong Kong.


Fig. 1 Cerebral grey matter differences in people with autism (top panel) and Asperger syndrome (bottom panel) compared with controls. Blue clusters represent less grey matter, whereas red clusters represent greater grey matter.

Conclusion

An ALE meta-analysis of grey matter differences in studies of Asperger syndrome or autism supports the argument against the disorder being considered solely a milder form of autism in neuro-anatomic terms. Whereas grey matter differences in people with Asperger syndrome are indeed more sparse than those reported in studies of people with autism, the distribution and direction of differences in each category is distinctive. Asperger syndrome involves clusters of lower grey matter volume in the right hemisphere and clusters of greater grey matter volume in the left hemisphere. Autism leads to more extensive bilateral excess of grey matter. Both conditions share clusters of grey matter excess in the left ventral temporal lobe components of the extrastriate visual system. This summary of a rich VBM MRI data set has important implications for how we categorize people on the autism spectrum and cautions that mixing individuals with autism and Asperger syndrome may at times obscure important characteristics manifested in one or the other condition alone.
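
For readers curious about the method itself, the core of an ALE (anatomic likelihood estimation) meta-analysis can be sketched in a few lines. The toy Python below uses a 1-D “brain” and invented foci, not the paper’s coordinates: each study’s reported focus is blurred into a Gaussian probability map, and the ALE score at each position is the probability that at least one study’s true focus lies there.

```python
# Toy 1-D ALE sketch (invented foci, NOT the Yu et al. data).
import numpy as np

positions = np.arange(0.0, 100.0)  # toy 1-D coordinate space (mm)
sigma = 5.0                        # Gaussian smoothing kernel width (mm)

# hypothetical grey-matter-difference foci (mm) reported by three studies
study_foci = [[22.0, 60.0], [25.0], [58.0, 61.0]]

def modeled_activation(foci, positions, sigma):
    """Per-study MA map: probability that the study's true focus falls at
    each position, taking the max across that study's reported foci."""
    maps = [np.exp(-0.5 * ((positions - f) / sigma) ** 2) for f in foci]
    return np.max(maps, axis=0)

ma_maps = [modeled_activation(f, positions, sigma) for f in study_foci]

# ALE score: probability that at least one study's focus lies here
ale = 1.0 - np.prod([1.0 - ma for ma in ma_maps], axis=0)

peak = positions[np.argmax(ale)]
print(f"ALE peaks at {peak:.0f} mm (score {ale.max():.2f})")
# The real method then tests these scores against a null distribution
# built from randomly placed foci, keeping only reliable convergence.
```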

_____________________________________________________________________________


The Myth of Choice / The American Consumer as Lab Rat

I have to say up front that on the “choice” curve (I’m sure someone somewhere has generated such a thing) I fall on the “less is more” end of the “not enough choice – too many choices” spectrum. But I think this favorite topic of marketing research – which drives contemporary product design and marketing – misses the point.

The necessary question is, Does a product or service exist that works, or will at least be “good enough” to satisfy a current need or preference?

I was at the grocery store yesterday, standing in the OTC aisle, looking for a generic version of Benadryl. Why pay a high price for a brand name when there is an identical product for much less – at least in theory – simply packaged in a less “zingy” cardboard-and-plastic package at the end of the assembly line at the very same factory? It’s a go-to choice for me and millions of other shoppers…

Like other severe allergy sufferers (suffer is the correct word), my goal is relief, and the search becomes something of a seasonal religious pilgrimage from store to store, pharmacy to pharmacy (I gave up on doctors long ago), aisle to aisle; a bewildering journey to find the holy object, concoction or magic pill that brings even moderate relief. What one finds is a wall of pink and purple and yellow and orange and blue boxes tricked out with labels and messages, the ingredients in nano-type… How many times have I grabbed a product, thinking it’s what I bought before, only to find when I get home that it’s not?

What ought to be simple is not! The store has changed its product array, eliminated “my” standard choice, or rearranged the wall – moving key products so that a “visual” location that was a good system for navigating the “too many choices” shelves is useless. Flagging down a clerk usually yields the “the product is right in front of your face” look; or the “corporate elves” moved everything again according to some new marketing scheme, or “we discontinued that item,” or “we’re out of stock,” or more “excuse choices.”

One soon appreciates that old age is as much a “contrived” challenge as a steady decline in sensory acuity: confusion becomes a default experience in dealing with the world. Despite all the media attention about a rapidly aging population and the need to “address” the needs of the elderly, there is little “real world” response, except an increase in expensive, useless and faulty junk products marketed as “free” because insurance will “pay for” them.

Besides, someone my age isn’t supposed to use the words “old, elderly, impaired, slow, over-the-hill”, etc. but to defy Nature by spending hundreds of thousands of dollars a year jumping out of airplanes (don’t forget your parachute), communing with “natives” in the Amazon, and looking spry, well-preserved and young, while doing it, or gardening with one’s equally well-preserved spouse on a grand estate in “the islands” somewhere, purchased with vast resources that one managed to not spend over the previous forty+ years of “living”.

And yet the monomaniacal product machine grinds on: the effects are not negligible.

If the package didn’t “identify” this product, could you tell what it is by the “official” label?

 

Americans “waste” so much of what we buy, and so much time shopping, whether for food, household supplies, OTC “medications” or the rest. And reading “labels” is not much help when the ingredients in laundry detergent seem to “pop up” in juice drinks, sodas, “organic” baked goods, and prepared and processed food: it’s ALL artificially produced at this point.

The fruits and vegetables and novelty “plant things” in the “green region” of our stores are just as artificial as the blue gummy stuff aimed at kids and sold as “vitamins” and “nutrition” supplements. The “money” spent to bring products to a store near you, or to deliver them to our homes from online sites, is allocated to overseas production, packaging, shipping, distribution, marketing, advertising, and endless conniving to make “junk” components, ingredients, materials, chemicals – and indeed dangerous or toxic fillers and additives – “appear to be” different products; in reality, only the “decorative boxes” distinguish the same essentially worthless contents from each other.

I buy the same products over and over – based on price, past experience and “comfort” level. It’s one of those efficiencies in the face of absurdity that we Asperger’s are “slammed for” – dull, plodding, unimaginative, but on budget.

Choice is just another lie in the “culture” of consumerism. That is a hurdle that cannot be overcome by the consumer. So-called “education” merely confirms the distortions presented over decades of deception, manipulation, and indoctrination; of false labeling, intentional disregard of product safety or efficacy, and reduction of quality that is the pattern of “modern” corporate – government behavior.

Americans constitute a population of 300 million “lab rats” who are “blamed” as personally responsible for the “bad health outcomes” of experimentation on human subjects.

More and more often I find myself bewildered when shopping: Is there actually anything worth eating, or safe to eat, that can be purchased at the “grocery” store? But, being a “realistic” Asperger, whatever “damage” to my body from ingesting bizarre ingredients has occurred – no remedy for that. Time spent making “decisions” as to which “pretty package” to purchase is just that…a waste of time.


Best Neanderthal Reconstruction? Blake Ketchum, PhD

http://blakeketchum.com/index.php/art/category/reconstruction

Forbes Quarry specimen, Gibraltar: Discovered 1848. I added some rough hair. I know this is supposed to be a female Neanderthal, but it’s a reach to imagine what she would look like. I think with hair, this person looks remarkably like “us”.

Extinction by Dr. Blake Ketchum. Cast Stone. 30cm tall. Actually, “Extinction” seems an inappropriate title – this Neanderthal looks very “familiar” – a fellow human being.

This portrait is a faithful forensic reconstruction of a Neanderthal individual. A model of the Forbes Quarry cranium was used as the foundation for the sculpture. You can see a progressive development of the sculpture here. I used the Manchester Method of forensic reconstruction, which relies more upon anatomical geometry than tissue depth data, which are not available for Neanderthals, of course. This sculpture has been shown widely in competitions, in international online exhibits, and in galleries in NYC and throughout the northeastern US. A cast is part of the permanent collection of the Earth and Mineral Sciences Museum at Penn State University.


Eurocentric Anthropology / Will the real H. Sapiens please stand up?

Right: A composite computer reconstruction of fossils from Jebel Irhoud shows a modern, flattened face paired with an archaic, elongated braincase (Meaning this is an “idealized model” that may or may not accurately represent actual persons who lived in this location at the time referenced.)

Homo sapiens as “designed by” Anthropologists

Article Comment: Very problematic (as usual), with the “definition” of Homo sapiens “changing” with each fossil discovery or re-analysis. This current “archaic skull shape” and “modern flat-face” discovery seems a flimsy basis for a vital species description, and is indeed Eurocentric! No surprise there. But where does that leave contemporary humans who do not have flat faces? Non-flat (under- or over-prognathic) faces are generally labeled as “deformities”… which leaves a considerable percentage of modern humans as either “archaic proto-Homo sapiens” or as “deformed” modern Homo sapiens. And what about “dolichocephalic” (long, narrow head) modern humans? What are they? Are aboriginal Australian people “Homo sapiens” or not? African Americans?

From: SCIENCE / AAAS

World’s oldest Homo sapiens fossils found in Morocco

(edited for length)

By Ann Gibbons Jun. 7, 2017

For decades, researchers seeking the origin of our species have scoured the Great Rift Valley of East Africa. Now, their quest has taken an unexpected detour west to Morocco: Researchers have redated a long-overlooked skull from a cave called Jebel Irhoud to a startling 300,000 years ago, and unearthed new fossils and stone tools. The result is the oldest well-dated evidence of Homo sapiens, pushing back the appearance of our kind by 100,000 years.

The discoveries, reported in Nature, suggest that our species came into the world face-first, evolving modern facial traits while the back of the skull remained elongated like those of archaic humans. (Neanderthals, H. erectus, Denisovans)

Back in 1961, miners searching for the mineral barite stumbled on a stunningly complete fossil skull at Jebel Irhoud, 75 kilometers from Morocco’s west coast. With its big brain but primitive skull shape, the skull was initially assumed to be an African Neandertal. In 2007, researchers published a date of 160,000 years based on radiometric dating of a human tooth.

At Herto, in Ethiopia’s Great Rift Valley, researchers dated H. sapiens skulls to about 160,000 years ago; farther south at Omo Kibish, two skullcaps are dated to about 195,000 years ago, making them the oldest widely accepted members of our species, until now.

Some researchers thought the trail of our species might have begun earlier. After all, geneticists date the split of humans and our closest cousins, the Neandertals, to at least 500,000 years ago, notes paleoanthropologist John Hawks of the University of Wisconsin in Madison. So you might expect to find hints of our species somewhere in Africa well before 200,000 years ago, he says.

One of the few people who continued to ponder the Jebel Irhoud skull was French paleoanthropologist Jean-Jacques Hublin…

The team now has new partial skulls, jaws, teeth, and leg and arm bones from at least five individuals, including a child and an adolescent, mostly from a single layer that also contained stone tools. In their detailed statistical analysis of the fossils, Hublin and paleoanthropologist Philipp Gunz, also of the Max Planck in Leipzig, find that a new partial skull has thin brow ridges. (Is this an actual skull or a reconstruction from statistical data?) And its face tucks under the skull rather than projecting forward, similar to the complete Irhoud skull as well as to people today. But the Jebel Irhoud fossils also had an elongated brain case and “very large” teeth, (as do many H. sapiens today) like more archaic species of Homo, the authors write.

New dates and fossils from Jebel Irhoud in Morocco suggest that our species emerged across Africa. The new findings may help researchers sort out how these selected fossils from the past 600,000 years are related to modern humans and to one another. (Or will result in more of the same “socio-academic” arguments over which “fossil humans” are WORTHY of being the ancestors of modern European H. sapiens)

The fossils suggest that faces evolved modern features before the skull and brain took on the globular shape (how magical!) seen in the Herto fossils and in living people. “It’s a long story—it wasn’t that one day, suddenly these people were modern,” Hublin says.

Neandertals show the same pattern: Putative Neandertal ancestors such as 400,000-year-old fossils in Spain have elongated, archaic skulls with specialized Neandertal traits in their faces. “It’s a plausible argument that the face evolves first,” (Really?) says paleoanthropologist Richard Klein of Stanford University in Palo Alto, California, although researchers don’t know what selection pressures might drive this. (Actually, many scenarios have been put forth as to changes in face shape – from changes in diet to a trend toward “gracile neoteny”)

The traditional anthropological arguments follow:

This scenario hinges on the revised date for the skull, which was obtained from burnt flint tools. (The tools also confirm that the Jebel Irhoud people controlled fire.) Archaeologist Daniel Richter of the Max Planck in Leipzig used a thermoluminescence technique to measure how much time had elapsed since crystalline minerals in the flint were heated by fire. He got 14 dates that yielded an average age of 314,000 years, with a margin of error from 280,000 to 350,000 years. This fits with another new date of 286,000 years (with a range of 254,000 to 318,000 years), from improved radiometric dating of a tooth. These findings suggest that the previous date was wrong, and fit with the known age of certain species of zebra, leopard, and antelope in the same layer of sediment. “From a dating standpoint, I think they’ve done a really good job,” says geochronologist Bert Roberts of the University of Wollongong in Australia.
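
A side note on the arithmetic of “14 dates yielding an average age”: repeated dates with individual uncertainties are typically pooled with an inverse-variance weighted mean. The Python sketch below uses invented numbers, not Richter’s measurements, purely to show the mechanics.

```python
# Toy pooling of repeated thermoluminescence dates (invented values).
import numpy as np

# hypothetical single-flint ages (thousands of years) and 1-sigma errors
ages = np.array([301.0, 322.0, 315.0, 330.0, 298.0, 318.0])
errs = np.array([ 25.0,  30.0,  22.0,  35.0,  28.0,  26.0])

weights = 1.0 / errs**2                       # inverse-variance weights
mean_age = np.sum(weights * ages) / np.sum(weights)
mean_err = np.sqrt(1.0 / np.sum(weights))     # error of the weighted mean

print(f"pooled age: {mean_age:.0f} ka +/- {mean_err:.0f} ka (1 sigma)")
# With 14 real dates, this kind of pooling is what produces a headline
# figure like "314 ka, with a range of 280-350 ka".
```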

Once Hublin saw the date, “we realized we had grabbed the very root of the whole species lineage,” he says. (!!!) The skulls are so transitional that naming them becomes a problem: The team calls them early H. sapiens rather than the “early anatomically modern humans” described at Omo and Herto. 

(Words again – word labels and categories – but where is the functional “reality”?)

Some people might still consider these robust humans “highly evolved H. heidelbergensis,” (arguments continue as to whether or not this is a made up or actual species!) says paleoanthropologist Alison Brooks of The George Washington University in Washington, D.C. She and others, though, think they do look like our kind. “The main skull looks like something that could be near the root (now there’s a fine example of specific concrete scientific language LOL Note that “root” refers to the popular “tree diagrams” of evolutionary relationships which are “archaic” at this point, and not up to date with current understanding of actual evolution…)  of the H. sapiens lineage,” says Klein, who says he would call them “protomodern, not modern.” (Aye, yai, yai! Habitual “word construct academic thought structures” are simply inadequate to understanding evolution)

The team doesn’t propose that the Jebel Irhoud people were directly ancestral to all the rest of us. (LOL Will any fossil humans ever be “good enough” to be the direct ancestors of European anthropologists?) Rather, they suggest that these ancient humans were part of a large, interbreeding population that spread across Africa when the Sahara was green about 300,000 to 330,000 years ago; they later evolved as a group toward modern humans. (Misty magic is invoked again) “H. sapiens evolution happened on a continental scale,” Gunz says.

Support for that picture comes from the tools that Hublin’s team discovered. They include hundreds of stone flakes that had been hammered repeatedly to sharpen them and two cores — the lumps of stone from which the blades were flaked off — characteristic of the Middle Stone Age (MSA). Some researchers thought that archaic humans such as H. heidelbergensis invented these tools. But the new dates suggest that this kind of toolkit, found at sites across Africa, may be a hallmark of H. sapiens. (Murky mental picture)

The finds will help scientists make sense of a handful of tantalizing and poorly dated skulls from across Africa, each with its own combination of modern and primitive traits. For example, the new date may strengthen a claim that a somewhat archaic partial skull at Florisbad in South Africa, roughly dated to 260,000 years ago, may be early H. sapiens. But the date may also widen the distance between H. sapiens and another species, H. naledi, that lived at this time in South Africa.

The connections among these skulls and the appearance of MSA tools across Africa at this time and possibly earlier show “a lot of communication across the continent,” Brooks says. (Communication is a popular and contagious theme in “communication-crazy” smartphone-era culture) “This shows a pan-African phenomenon, with people expanding and contracting across the continent for a long time.” (Novel writing is the true calling of archaeologists and anthropologists and so-called science writers)

_______________________________________________

The original paper: Nature

New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens

Jean-Jacques Hublin, (see original for all authors)

Abstract

Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure because the fossil record is scarce and the chronological age of many key specimens remains uncertain. In particular, it is unclear whether the present day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens1 or evolved gradually over the last 400 thousand years2. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating)3, this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established. Furthermore, it shows that the evolutionary processes behind the emergence of H. sapiens involved the whole African continent.

Comment: Whatever happened to reproductive success as the distinctive basis for a species division? 

Homo sapiens did interbreed with “archaic humans” after all; is Homo sapiens a “real” species, or a “breed” of an earlier foundational Homo species? It seems that in Eurocentric anthropology, at least, Homo sapiens is a morphologic type and not a species at all.

Where is reproduction of viable offspring in this “tangled obsession” with “body parts” as signs and portents in Eurocentric narcissism? Homo sapiens simply “looks like us”! How does “flat face” versus “prognathic face” impact “having offspring” – unless, of course, you are racist?

Face “shape” may be determined by factors having nothing at all to do with what is vital to being a modern Homo sapiens!


Genetic Literacy Project / No increase in “Autism” – Mystery Solved

See post: “The $9,000,000,000 U.S. Autism Industry” – that’s $9 BILLION.

No, it’s not vaccines, GMOs, glyphosate – or organic foods

Clip from Genetic Literacy Project:

So what does the latest evidence show? There now is intriguing evidence that there has in fact been no dramatic rise in autism after all. According to a just-released study, scientists at Aarhus University in Aarhus, Denmark, assessed more than 670,000 children born between 1980 and 1991 in Denmark, following them from birth until they were diagnosed with autism spectrum disorder, died, emigrated, or reached the end of the study period in December 2011. Among other things, Denmark is renowned for its excellent national medical records system, which allowed the researchers to conduct a study of this magnitude over such an extended time span. Among the population studied, 4,000 children were diagnosed as being along the autism spectrum, and many of these diagnoses were made after 1995. Look at what happened just before that detected increase. Tara Haelle reports in Forbes:

In Denmark in particular, the diagnostic criteria for autism expanded in 1994 to include a spectrum of disorders with a broader list of symptoms, thereby widening the definition of autism. Then in 1995, national data tracking began to include diagnoses made from outpatient visits rather than just diagnoses of those admitted to a healthcare facility.

The exact same thing has happened in every country that has seen soaring autism rates – the definition of what constitutes autism was dramatically expanded in the early 1990s to embrace the catch-all term Autism Spectrum Disorder – correlating with when GMO usage, chemtrail rates, pesticide exposure and organic food sales began a sharp increase.

The researchers discovered that the change in diagnostic criteria, taken together with the diagnoses made outside of a healthcare facility, accounted for as much as 60 percent of the increase in prevalence of autism spectrum disorders. The authors of the study conclude thus:

Changes in reporting practices can account for most (60 percent) of the increase in the observed prevalence of ASDs in children born from 1980 through 1991 in Denmark. Hence, the study supports the argument that the apparent increase in ASDs in recent years is in large part attributable to changes in reporting practices.

Though this in itself isn’t evidence of a lack of increase in the prevalence of autism, it does say very emphatically that the huge uptick in the number of autistic children diagnosed has more to do with how we diagnose the condition than with an actual increase.
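
To make the “60 percent” figure concrete, here is a toy decomposition in Python with invented prevalence numbers (not the Danish study’s): the reporting-practices share of the apparent rise is subtracted first, and only the remainder is even a candidate for a “true” increase.

```python
# Toy decomposition of an apparent prevalence rise (invented numbers).
old_prevalence = 0.004   # hypothetical baseline autism prevalence (0.4%)
new_prevalence = 0.014   # hypothetical later observed prevalence (1.4%)
reporting_share = 0.60   # fraction of the rise explained by reporting changes

rise = new_prevalence - old_prevalence
reporting_rise = reporting_share * rise
residual_rise = rise - reporting_rise

print(f"observed rise:                  {rise:.3%} of the population")
print(f"explained by reporting changes: {reporting_rise:.3%}")
print(f"residual (possibly real) rise:  {residual_rise:.3%}")
```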

The idea that increased diagnosis contributes to higher prevalence of a disease is not new at all. In fact, it is quite common, especially as new diagnostic techniques come into play and early screening programs are put in place by governments. This often leads to debates in the medical literature about whether increases in the prevalence of a disease are real or due to an increase in diagnosis. Prostate cancer is a common example – incidence of prostate cancer jumped over 100 percent from 1986 to 1992, which coincided with an aggressive expansion of the prostate cancer screening program based on the Prostate Specific Antigen (PSA) test, approved by the FDA in 1986.

The result of the autism study, even if somewhat expected, is still very important. The quality of Denmark’s health records and the size of the study make it unique – they make the data extremely robust and reliable. So how do these results translate to the United States? We do have similarities in how the diagnosis has changed, writes Tara Haelle in Forbes:

The way autism is defined in the U.S. has changed dramatically since 1980, when it first appeared in the DSM-III as “Infantile Autism” and could only be diagnosed in children whose symptoms began before they were three years old. Autism spectrum disorders have expanded to include diagnosis without a specific age requirement beyond the “early developmental period” and without requiring significant language impairment in the recently revised DSM-5.
The vast majority of people diagnosed with autism spectrum disorders today would never have qualified under the 1980 classification, and no formal classification separate from schizophrenia existed before then. So it’s not surprising that numbers have increased in the U.S.
Note how the deletion of “language impairment” whimsically redefined Asperger’s as “Autism” – a socially motivated decision which is not based in scientific evidence.

There are many possible reasons why there is an increase in the prevalence of a disease. Apart from increased screening and changes in diagnostic criteria, factors like increased awareness also come into play. As credible scientific efforts around the world continue to identify genetic and/or environmental causes behind autism, it is prudent not to be taken in by wild claims or to give in to the fears spread by those who accept and promote pseudoscience. (Such as psychologists and psychiatrists and the Autism Industry) And no, the Wi-Fi in your house or the genetically modified foods you eat will not lead to your child becoming autistic.

Arvind Suresh is a science communicator and a former laboratory biologist. Follow him @suresh_arvind

New Research / Female Hips and Running, Birth

 

john hawks weblog

If the basic assumptions of the obstetric dilemma are right, says Lewton, participants with wider hips should run and walk less efficiently than those with narrow ones. But that wasn’t what Lewton and her team found. Instead, they found no connection at all between hip width and efficiency: wide-hipped runners moved just as well as their narrow-hipped peers. Lewton and her colleagues published their results in March 2015 in the online journal PLOS ONE. The work was supported by grants from the National Science Foundation and The Leakey Foundation.
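
For readers who want to see what “no connection at all” looks like statistically, here is a minimal Python sketch of the underlying test, using simulated data rather than the study’s measurements: regress locomotor cost on hip width and ask whether the slope differs from zero.

```python
# Toy version of the hip-width vs. locomotor-cost test (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30
hip_width_cm = rng.normal(26.0, 2.0, n)   # hypothetical pelvic breadths
# cost independent of hip width plus noise: the "no trade-off" scenario
cost = rng.normal(0.25, 0.03, n)          # hypothetical cost of transport

result = stats.linregress(hip_width_cm, cost)
print(f"slope = {result.slope:.4f}, r = {result.rvalue:.2f}, "
      f"p = {result.pvalue:.2f}")
# A slope near zero with p well above 0.05 means the data give no
# evidence that wider hips make walking or running more costly.
```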

“This ‘trade-off’ between hips wide enough for a big baby and small enough for efficient locomotion does not seem to occur,” says Lewton. “That means that we have to rewrite all of the anthropology textbooks! Even outside of textbooks, the general public thinks that if your hips are wide, you’re a bad biped, and that does not seem to be the case.”

The research paper discussed in the article is the recent one by Anna Warrener and colleagues, “A Wider Pelvis Does Not Increase Locomotor Cost in Humans, with Implications for the Evolution of Childbirth”.

Reference

Warrener AG, Lewton KL, Pontzer H, Lieberman DE (2015) A Wider Pelvis Does Not Increase Locomotor Cost in Humans, with Implications for the Evolution of Childbirth. PLoS ONE 10(3): e0118903. doi:10.1371/journal.pone.0118903

Left: “Lucy” Australopithecus afarensis 3.2 mya Center: H. erectus female 1.2 mya Right: modern H. sapiens female.