Making Stone Tools / A Non-verbal Process

A super group of videos…

Not a single word is needed to do this or to teach someone else to do this. The “tips” at the end would be demonstrated during the process. Children would see tools and other objects made day in and day out and would naturally copy their elders.

Archaeologists go on and on about how it takes “advanced cognitive skills” (like those needed to push around a shopping cart and swipe a credit card) to create stone tools. I have yet to hear a single researcher mention visual thinking. You can babble at a pile of stones, or another human, all day long, but all that yack-yacking will not produce one stone tool. The earliest stone tools are millions of years old; sophisticated flaked tools (Acheulean) were invented by Homo erectus, not Homo sapiens. Some research indicates that “language” structure had its beginnings in sign language and not in vocalization. Pre- and early humans were visual observers, inventors and communicators – and not at all like modern social humans, who are a very recent “neotenic” variation of Homo sapiens.

All it takes is A FEW adept individuals to preserve techniques and to pass on skills. If a group were lucky, one “genius” might come up with improvements and refinements so that technical advancement could occur – which would probably be forgotten and reinvented many times. And critically, resources in one’s environment dictated solutions: nomadism provided exposure to new raw materials and new people, so “itchy feet” were likely more advantageous than staying in one place too long.


Lesson about Nature and Evolution / My Pitiful Garden

These photos sum up “flora” in the big scrubland that is southwestern Wyoming and the other desert basins that compose the “basin and range” geography of the American West.

A “lush” scene of sage brush, snakeweed, bunch grasses and weeds in late summer, at just the right moment in evening, when the sun produces “color” in the landscape. I used to differentiate “weeds” from native plants, but just what is native to this place? The idea is a bit preposterous: any blown-in or tracked-in seed that manages to establish a colony is legitimate: survival, not “social labels,” becomes the measure of successful plant life (and people).

When I moved to town 22 years ago, the yard of the house was packed mud, as were many other “lawns”. I wanted to do something with this blank canvas, but didn’t have any money to spare or a place to buy “real” plants, shrubs, trees and perennials. Any existing landscaping in the neighborhood was typical – lawns and lilacs; fir trees and spring fruit trees that bloomed like crazy but were not orchard varieties; no edible fruit produced. Chinese elms (now that’s a weed!) proliferate in town, along with renegade forms of domestic plants gone wild – planted many decades ago, the most rugged specimens have been selected by the merciless environment and are all but indestructible.

Up close, the countryside reveals some lovely plants; not at all like domestic garden types, but interesting. And of course, rocks; a never-ending supply of metamorphic cobbles washed down from the mountains during glaciations and rounded and polished to perfection, and slabs of sandstone broken out of deep outcrops by freeze and thaw leverage, and strewn about by gravity. These I dragged home, a few each day, as I wandered around getting to know my new homeland.

Metamorphic “texture” visible in a river cobble

Fossil raindrop prints – small puddles in sandstone

I learned that sage brush can be transplanted; necessity revealed how to do it. A single large sage brush is impossible to dig up. The roots extend for many feet – a tap root straight down into the earth and many more that travel under the surface horizontally. But the small extensions that pop up around the main plant can be easily pulled from sandy areas with roots intact. I literally planted dozens of these, to ensure that one or two survived. I didn’t amend or improve the soil (there wasn’t any). The differences between wild and domestic plants were soon obvious: how much water would the sage and Artemisia, globe mallow and flax, and unidentified “others” tolerate? And which plants would simply not transplant at all, and require starting from seeds?

Shadscale leaf – flower – seed

A weed that I used to bad-mouth until I discovered that its seeds feed a host of small birds all winter.

I collected desirable seeds whenever they appeared; the desert plants have their own timing due to the sporadic delivery of rain, so I stripped handfuls – some plants produce seeds that look like small flowers or parts of branches. In imitation of “nature” I tossed them randomly about the lot and forgot about them. It was then a surprise to discover something growing at all.

Eventually a proper nursery opened in town, where I supplemented the donations of iris from neighbors, and I fell into the “domestic trap” of wanting cut flowers. At first, perennials thrived, especially ground covers rooted between cobbles that made up the “rock garden” and other traditional flower producers. But plants that are perennials in less tortured climates proved to be biennials in most cases; even hardy iris had their tubers or roots “freeze-dried” by our cold, dry winters.

My questions about the ubiquitous limited landscaping in town were quickly answered: “It’s the climate, stupid.” So I replaced whichever plants died with those that didn’t, and with ever more rock and gravel, and with evergreen shrubs, which adapted well. Flowers became annuals in pots; waterproof pots. Traditional clay pots simply turn into tombs for the mummies of their once-living contents. I abandoned the front yard to vegetation that never needs additional water. It’s literally a “what grows, grows” plot just like the countryside. And deer eat their choice of favorite vegetation.

I must mention that all of this trial-and-error gardening is only possible because the city is a “hands-off” regime; it does occasionally cut weeds along the parkway and alleys, but budget cuts have all but eliminated even this activity. Years ago an overeager teenage employee with a weed-whacker interpreted my parkway landscaping as weeds and reduced the area to a crew cut – pitiful! But one irate phone call to the city and a letter to the editor of the local paper produced a “vow” that no one would come near my house again. That’s responsive government.

This year, the “garden” is down to a few pots in the back yard, and even these typical annuals are struggling despite two “fill-ups” of water every day. The sun at 6,100 feet actually burns leaves to a crisp, and along with temps in the high 80s – 90s and 10–15% humidity, the wind sucks the moisture from every living and non-living substance. It’s discouraging. But it’s a lesson in reality that should be obvious to all human beings, now that the earth is changing dramatically; global warming and cooling are typical, and periodically extreme, in climate history. Much of the earth’s surface is uninhabitable by humans: that’s a fact that has only temporarily been overcome by massive water management, diversion and reckless depletion. That inhospitable area is increasing and shifting latitude northward and southward in ways that are unpredictable, given the complexity of the physics and chemistry involved.

When “visiting” my poor beleaguered pots of plants this morning, I realized that “adaptation” to this harsh place has not been a matter of trying to bend it to my will, but of letting it change me. That’s a good thing. The changes coming to earth are normal and inevitable; so is human stupidity. I have no illusions: nature will move on, with or without us, just as it always does.


New Experience / Academic insanity meltdown

I’m feeling physically ill this morning; stayed up late subjecting myself to the content of a “scientific” paper that is the worst pile of crap I’ve ever encountered – published in a serious British journal. The subject: Social evolution of humans. The “line of thinking” is so outrageous, so intellectually offensive, that I would call it pornographic: intellectual porn.

A criminal use of the human brain.

I intended to expose this paper, but it had such a disturbing effect that I couldn’t continue with a critique. My point is that I’ve discovered this “feeling” in myself of “insult by intellectual attack” and I have no word for it. (I bet the Germans do.) Something like a meltdown; an attack on sanity delivered by “thought pollution” and not by sensory overload. And I don’t mean a personal attack, but that the assumptions and assertions made and represented as “scientific” work were published by a top journal, as if no one noticed the absurdities.

I even thought momentarily that the paper was an intentional monstrosity, “planted” to test the (corrupt?) review process of some science publishers…so I went looking for more papers using search words that were “ungoof-upable” even by Google. OMG! The paper was not a “fluke”.

I did encounter a review of the paper and its ideas by a scientist in the same field and it was “politely” scathing – about as close to a tirade as a review can get. It should have made me feel better. It didn’t, because the paper’s writers are established “prestigious” academics, not “ancient alien” conspiracy crackpots – but crackpots within the sciences.

Am I overreacting? I would say not, because this paper served as the “trigger” for the cumulative response to a lifetime of encounters with “nonsense” as the prevailing trend in modern thought. That is, it is the difference between “studying” earthquakes and being in the zone of destruction when the earth “slips” violently – and suddenly, physically, viscerally one experiences the full meaning of danger.

It’s a “Buddhist” moment for me.


Infant Synesthesia / A Developmental Stage

No, synesthesia is not a symptom of disorder, but it is a developmental phenomenon. In fact, several researchers have shown that synesthetes can perform better on certain tests of memory and intelligence. Synesthetes as a group are not mentally ill. They test negative on scales that check for schizophrenia, psychosis, delusions, and other disorders.

Synesthesia Project | FAQ – Boston University

________________________________________________________________

What if some symptoms “assigned” by psychologists to Asperger’s Disorder and autism are merely manifestations of synesthesia?

“A friend of mine recently wrote, ‘My daughter just explained to me that she is a picky eater because foods (and other things) taste like colors and sometimes she doesn’t want to eat that color. Is this a form of synesthesia?’ Yes, it is.” – Karen Wang

We see in this graphic how synesthesia is labeled a “defect” that is “eradicated” by normal development (literally “pruned out”). People who retain types of integrated sensory experience are often artists, musicians, and other sensory innovators (chefs, interior designers, architects, writers). So, those who characterize “synesthesia” as a developmental defect are labeling the very individuals who greatly enrich millions of human lives as “defectives”. Psychology pathologizes the most admired and treasured creative human behavior.

No touching allowed! Once “sensory” categories have been labeled and isolated to locations in the brain, no “talking to” each other is allowed. The fact that this is a totally “unreal” scheme is ignored. Without smell, there IS NO taste…

________________________________________________________________

Infants Possess Intermingled Senses

Babies are born with their senses linked in synesthesia

originally published as “Infant Kandinskys”

What if every visit to the museum was the equivalent of spending time at the philharmonic? For painter Wassily Kandinsky, that was the experience of painting: colors triggered sounds. Now a study from the University of California, San Diego, suggests that we are all born synesthetes like Kandinsky, with senses so joined that stimulating one reliably stimulates another.

The work, published in the August issue of Psychological Science, has become the first experimental confirmation of the infant-synesthesia hypothesis—which has existed, unproved, for almost 20 years.

Researchers presented infants and adults with images of repeating shapes (either circles or triangles) on a split-color background: one side was red or blue, and the other side was yellow or green. If the infants had shape-color associations, the scientists hypothesized, the shapes would affect their color preferences. For instance, some infants might look significantly longer at a green background with circles than at the same green background with triangles. Absent synesthesia, no such difference would be visible.

The study confirmed this hunch. Infants who were two and three months old showed significant shape-color associations. By eight months the preference was no longer pronounced, and in adults it was gone altogether.

The more important implications of this work may lie beyond synesthesia, says lead author Katie Wagner, a psychologist at U.C.S.D. The finding provides insight into how babies learn about the world more generally. “Infants may perceive the world in a way that’s fundamentally different from adults,” Wagner says. As we age, she adds, we narrow our focus, perhaps gaining an edge in cognitive speed as the sensory symphony quiets down. (Sensory “thinking” is replaced by social-verbal thinking)

(Note: The switch to word-concept language dominance means that modern social humans LOSE the appreciation of “connectedness” in the environment – connectedness becomes limited to human-human social “reality”. The practice of chopping up reality into isolated categories (word concepts) diminishes detail and erases the connections that link detail into patterns. Hyper-social thinking is a “diminished” state of perception characteristic of neurotypicals.)

________________________________________________________

GREAT WEBSITE!!!

The Brain from Top to Bottom

thebrain.mcgill.ca/

McGill University
Explore topics such as emotion, language, and the senses at five levels of organization (from molecular to social) and three levels of explanation (from beginner … advanced)

Intuition as an Analytical Tool / It’s not “magic”

As an intuitive thinker, I have my own ideas about “what it is” and “how it works”, and I agree with the author that the “split” between “analysis by logic” and “analysis by intuition” is imaginary; a product of the “neurotypical” fixation on black OR white, left OR right, male OR female, good OR evil, normal OR abnormal.

Complexity simply cannot be “grasped” by the polarized “social” brain. Hence, the insistence that “complexity” cannot be organized, analyzed or comprehended. “Intuition” is vital to dealing with complexity, but it is typically thought of as akin to “magic” – something that “happens” in the stomach or intestines (how gross and ridiculous) with bright ideas (usually signified by a lightbulb) – is there a lightbulb in one’s stomach or intestines?

This is how I arrange “thinking” –

Visual thinking is the primary “intuitive-instinctive” brain function. Images are the “units” of thinking; patterns and connections are the style. We can think of intuition as visual language. Humans “share” visual thinking with other animals. Some animals are “olfactory” thinkers or “acoustic thinkers” or “name the sensory apparatus the animal relies on most” thinkers. These are “instinctive” languages.

Verbal thinking is a recent function that is specific to humans: it is the “conscious” brain function. “Conscious” thinking IS USING WORDS TO THINK. Try thinking without words. Word thinking is generalized – social, and cultural. Structure for verbal language is present as “potential” in the brain, but word language must be learned.

Abstract thinking is “math and symbol systems language”. Are “maths” a human invention, or the fundamental code that “writes the universe into existence”?

I wanted to see what people in unrelated fields think about intuition – this article is from a business perspective.

One comment: In my experience, intuition is “naturally” necessary in “emergency” situations – like deadlines – when “automatic” (instinctive) analysis is by far the fastest – split second, result-producing function. For me, verbal thinking is a “dull, plodding chore” in comparison. The result is that I’m at my best (thinking-wise) in the “critical present” and useless in linear “planning”.  Business is simply “a bore” so let’s see what a business person has to say…

http://analytics-magazine.org/forum-intuition-based-decision-making-the-other-side-of-analytics

Forum: Intuition-based decision-making: The other side of analytics

March/April 2015

By Jay Liebowitz

In the fall 2014 issue of Johns Hopkins Magazine, Aneesh Chopra (the first U.S. chief technology officer) said, “When it comes to making major decisions, there are two camps. One consists of people who believe intuition trumps analysis – go with your gut. The other rejects intuition in favor of careful data analysis – where there is enough data, there’s no need for intuition. The ideal is a marriage of the two.”

I also feel that we need a complementary set of both analytics and intuition, and I would like to focus on the latter part, which hasn’t been discussed much in the analytics community and conferences.

According to Tim Cook, CEO of Apple, “Intuition is something that occurs in the moment, and if you are open to it, if you listen to it, it has the potential to direct or redirect you in a way that is best for you.”

Said Albert Einstein: “The really valuable thing is intuition.”

According to Betsch (2008), intuition is a process of thinking whereby the input is processed automatically and without conscious awareness, resulting in a feeling that can serve as a basis for judgments and decisions. In experiments with shares in the stock market, Betsch (2008) found that most of the participants said they relied on their “gut reaction” or “intuitive feeling” when judging the shares. Similar conclusions were also reached in political judgment and other domains.

Many people believe intuition is an instinctive “knowing” without the support of logic, analysis or actual evidence (Liebowitz, 2014b; Dorfler and Ackermann, 2012; Greengard, 2012; Heskett, 2013; Hensman and Sadler-Smith, 2011; Williams, 2012; Woiceshyn, 2009). But in reality, intuition is founded upon the scrutinizing of failures and lessons learned. So, to characterize intuition, it typically is: experience-driven, holistic, affective, quick and non-conscious (meaning that it is difficult to trace a logic trail to the decision). (Meaning that it is non-verbal)

Intuitive decision-making is one’s ability to recognize patterns at lightning speed – a process that often happens unconsciously (Matzler et al., 2007). In an experiment to reposition 25 pieces in chess after examining them for a few seconds, the inexperienced chess players located an average of only six of the original positions. However, the chess master correctly replaced all 25 pieces. (Visual thinking)

According to Dane and Pratt (2012), intuition may be just as effective in decision-making as an analytical approach – and sometimes more efficient and effective, depending on the decision-maker’s level of expertise on the subject at hand. Dane and Pratt (2012) further state that if you’re working in an industry where you have risen through the ranks, your domain expertise will likely better serve an intuitive approach. If you gained your expertise in a different field, you may not have the background to rely as strongly on your intuition.

Perhaps those from MIT and Austria (Matzler et al., 2007) said it best: “For many complex decisions, all the data in the world can’t trump the lifetime’s worth of expertise that informs one’s gut feeling, instinct, or intuition.” In their research, they talk about honing an executive’s intuition. Specifically, cultivating instinct requires the following factors: experience, networks, curiosity, tolerance, emotional intelligence (from the leadership research, emotional intelligence is the key differentiator between successful leaders and those who are not) and limits (like any good thing, a reliance on intuition can be taken to extremes – executives should reflect on their intuitive decisions before they execute them).

According to Parkinson (2014), “We generally have good intuition about things that are similar to what we encounter every day, and are able to make ‘instinctive’ decisions (based on comparisons with our experience) that are generally correct. But we have poor intuition about things that are outside of everyday experience and very poor intuition about things that are totally alien.” (Instinct provides “evolutionary experience” – tremendous depth of time and testing go into “instincts” – packaged and “ready to work” for us. We tend to ignore this incredible resource)  

Certainly, intuition has its disadvantages. In speaking at the 2014 Canada’s Best Managed Companies Conference, Dr. Salman Mufti, executive education director at the Queen’s University School of Business in Canada, cautions us to perhaps start with your intuition, but validate or verify it. When I worked with some auditors, their credo was, “trust but verify” (sounds somewhat similar). Peter Drucker, one of the fathers of management, said that we shouldn’t be “hunch artists,” rather we should “believe in intuition only if you discipline it.”

Perhaps we should think of intuition being based on analysis and experience, and we should apply “rational intuition” (Heskett, 2010) to our management decision-making. I believe we need to educate “informed intuitants,” as I pointed out in a column in the SAS Exchange (Liebowitz, 2014a). The CEB (Corporate Executive Board)(2013) talks about the necessary skills as applied to analytics, such as problem-solving, intellectual curiosity, issue diagnosis, insight generation, synthesis of internal and external data, problem-framing, and synthesis of financial and qualitative data. But there are other skills integral to the informed intuitant (ouch!) (Liebowitz, 2014a), including:

§  collaboration abilities, such as team building, project management and interpersonal communications (oral and written);

§  creativity-enhancing skills to think outside the box;

§  business-speak, summarization and data visualization techniques for the analyst to explain their results to C-level executives; and

§  learning by doing or testing by learning methods to sharpen the analytical and decision-making skill sets.

In looking at some of the technical journal research, intuition plays a key role in decision-making worldwide. For example, Bocco and Merunka (2013) reported research of small- and medium-sized enterprises (SMEs) in Africa, where more than 300 managers and entrepreneurs at SMEs revealed that intuition is a key resource for managerial decision-making. Loechner (2014) reported a study where even people who think of themselves as data-driven decision-makers (i.e., “I collect and analyze data as much as possible before making a decision”), also place trust in their own intuition. According to the study, 73 percent of executives surveyed said they trust their own intuition when it comes to decision-making, and, even among the data-driven decision-makers, 68 percent agree with that statement. (Hmmm…this “self-analysis” can be faulty – if one “trusts one’s intuition” but doesn’t examine the results of one’s intuition-driven decisions, then intuition may not yield good outcomes) 

Interestingly, research from Dartmouth (Kyung and Thomas, 2013) showed that you couldn’t rely on your intuition if given negative feedback. In one of the experiments, the researchers gave false negative feedback to half the participants by telling them they were wrong even when their answers to some questions were correct. Subjects whose confidence had been disrupted by negative feedback lost the relative accuracy advantage from relying on their intuition (Kyung and Thomas, 2013). (Social typicals may therefore not be able to develop intuition, whereas “Asperger-types” are  not likely to be “swayed” by negative feedback and will stick to their answers or observations!) 

Some research is being done now to assess one’s intuition, such as the CEB’s Insight IQ instrument (CEB, 2013). They found that 19 percent of more than 5,000 managers in major global companies are “visceral decision-makers” who rely almost exclusively on intuition.

So, what can be done to improve our decision-making from a business intuition perspective? Dimitrius and Mazzarella (2008) offer some suggestions:

§  Recognize and respect your intuition, not following it blindly or rejecting it outright. (Either – or again – I find it to be more like “surfing” a wave)

§  Identify what your intuition is telling you. Follow the hunch, asking what is it?

§  Review the evidence by playing back the events in order to become more conscious of the signs.

§  Prove or disprove your theory. Gather additional information to consciously test your theory.

For some reason, much of the intuition in management research is being done in Europe (United Kingdom) and Australia. Not much in the United States per se. If we want to improve this area of research, there are some recommendations for advancing the current state-of-the-art, as highlighted by Akinci and Sadler-Smith (2012) and Sinclair (2014):

§  careful conceptual framing;

§  greater cross-disciplinary collaboration and integration;

§  increased methodological rigor and pluralism; and

§  closer attention to levels of analysis issues.

Here are some guidelines (Sadler-Smith and Shefy, 2004) to help further develop your intuitive awareness:

§  open up the closet (be amenable to count on intuitive judgments);

§  don’t mix up your “I’s” (instinct, insight and intuition);

§  elicit good feedback;

§  get a feel for your batting average (benchmark your intuitions);

§  use imagery rather than words;

§  play devil’s advocate; and

§  capture and validate your intuitions.

Intuition may breed innovation. If you are tied only to statistics and analytics, you may miss new and better strategies, opportunities and methods for your business to be more efficient, effective and successful. Use your informed intuition, founded upon years in business and inspired by trends in big data, to navigate the future of your business.

Jay Liebowitz (jliebowitz@harrisburgu.edu) is the DiSanto Visiting Chair in Applied Business and Finance at the Harrisburg University of Science and Technology.

Debunking Left Brain, Right Brain Myth / Paper – U. Utah Neuroscience

An Evaluation of the Left-Brain vs. Right-Brain Hypothesis with Resting State Functional Connectivity Magnetic Resonance Imaging

Jared A. Nielsen, et al., Interdepartmental Program in Neuroscience, University of Utah, Salt Lake City, Utah, United States of America (see original for full author list and affiliations)

Published: August 14, 2013

https://doi.org/10.1371/journal.pone.0071275 (Extensive paper with loads of supporting graphics, etc.; a heavy-going technical paper)

Abstract

Lateralized brain regions subserve functions such as language and visuospatial processing. It has been conjectured that individuals may be left-brain dominant or right-brain dominant based on personality and cognitive style, but neuroimaging data has not provided clear evidence whether such phenotypic differences in the strength of left-dominant or right-dominant networks exist. We evaluated whether strongly lateralized connections covaried within the same individuals. Data were analyzed from publicly available resting state scans for 1011 individuals between the ages of 7 and 29. For each subject, functional lateralization was measured for each pair of 7266 regions covering the gray matter at 5-mm resolution as a difference in correlation before and after inverting images across the midsagittal plane. The difference in gray matter density between homotopic coordinates was used as a regressor to reduce the effect of structural asymmetries on functional lateralization. Nine left- and 11 right-lateralized hubs were identified as peaks in the degree map from the graph of significantly lateralized connections. The left-lateralized hubs included regions from the default mode network (medial prefrontal cortex, posterior cingulate cortex, and temporoparietal junction) and language regions (e.g., Broca Area and Wernicke Area), whereas the right-lateralized hubs included regions from the attention control network (e.g., lateral intraparietal sulcus, anterior insula, area MT, and frontal eye fields). Left- and right-lateralized hubs formed two separable networks of mutually lateralized regions. Connections involving only left- or only right-lateralized hubs showed positive correlation across subjects, but only for connections sharing a node. Lateralization of brain connections appears to be a local rather than global property of brain networks, and our data are not consistent with a whole-brain phenotype of greater “left-brained” or greater “right-brained” network strength across individuals. Small increases in lateralization with age were seen, but no differences in gender were observed.

From Discussion

In popular reports, “left-brained” and “right-brained” have become terms associated with both personality traits and cognitive strategies, with a “left-brained” individual or cognitive style typically associated with a logical, methodical approach and “right-brained” with a more creative, fluid, and intuitive approach. Based on the brain regions we identified as hubs in the broader left-dominant and right-dominant connectivity networks, a more consistent schema might include left-dominant connections associated with language and perception of internal stimuli, and right-dominant connections associated with attention to external stimuli.

Yet our analyses suggest that an individual brain is not “left-brained” or “right-brained” as a global property, but that asymmetric lateralization is a property of individual nodes or local subnetworks, and that different aspects of the left-dominant network and right-dominant network may show relatively greater or lesser lateralization within an individual. If a connection involving one of the left hubs is strongly left-lateralized in an individual, then other connections in the left-dominant network also involving this hub may also be more strongly left lateralized, but this did not translate to a significantly generalized lateralization of the left-dominant network or right-dominant network. Similarly, if a left-dominant network connection was strongly left lateralized, this had no significant effect on the degree of lateralization within connections in the right-dominant network, except for those connections where a left-lateralized connection included a hub that was overlapping or close to a homotopic right-lateralized hub.

It is also possible that the relationship between structural lateralization and functional lateralization is more than an artifact. Brain regions with more gray matter in one hemisphere may develop lateralization of brain functions ascribed to those regions. Alternately, if a functional asymmetry develops in a brain region, it is possible that there may be hypertrophy of gray matter in that region. The extent to which structural and functional asymmetries co-evolve in development will require further study, including imaging at earlier points in development and with longitudinal imaging metrics, and whether asymmetric white matter projections [52], [53] contribute to lateralization of functional connectivity.

We observed a weak generalized trend toward greater lateralization of connectivity with age between the 20 hubs included in the analysis, but most individual connections did not show significant age-related changes in lateralization. The weak changes in lateralization with age should be interpreted with caution because the correlations included >1000 data points, so very subtle differences may be observed that are not associated with behavioral or cognitive differences. Prior reports with smaller sample sizes have reported differences in lateralization during adolescence in prefrontal cortex [54] as well as decreased structural asymmetry with age over a similar age range [55].

Similarly, we saw no differences in functional lateralization with gender. These results differ from prior studies in which significant gender differences in functional connectivity lateralization were reported [16], [17]. This may be due to differing methods between the two studies, including the use of short-range connectivity in one of the former reports and correction for structural asymmetries in this report. A prior study performing graph-theoretical analysis of resting state functional connectivity data using a predefined parcellation of the brain also found no significant effects of hemispheric asymmetry with gender, but reported that males tended to be more locally efficient in their right hemispheres and females tended to be more locally efficient in their left hemispheres [56].

It is intriguing that two hubs of both the left-lateralized and right-lateralized network are nearly homotopic. Maximal left-lateralization in Broca Area corresponds to a similar right-lateralized homotopic cluster extending to include the anterior insula in the salience network. Although both networks have bilateral homologues in the inferior frontal gyrus/anterior insular region, it is possible that the relative boundaries of Broca Homologue on the right and the frontoinsular salience region may “compete” for adjacent brain cortical function. Future studies in populations characterized for personality traits [57] or language function may be informative as to whether local connectivity differences in these regions are reflected in behavioral traits or abilities. The study is limited by the lack of behavioral data and subject ascertainment available in the subject sample. In particular, source data regarding handedness is lacking. However, none of the hubs in our left- and right- lateralized networks involve primary motor or sensory cortices and none of the lateralized connections showed significant correlation with metrics of handedness in subjects for whom data was available.

Despite the need for further study of the relationship between behavior and lateralized connectivity, we demonstrate that left- and right-lateralized networks are homogeneously stronger among a constellation of hubs in the left and right hemispheres, but that such connections do not result in a subject-specific global brain lateralization difference that favors one network over the other (i.e. left-brained or right-brained). Rather, lateralized brain networks appear to show local correlation across subjects with only weak changes from childhood into early adulthood and very small if any differences with gender.


Debunking Left Brain, Right Brain Myth / PLOS Paper – Corballis

Left Brain, Right Brain: Facts and Fantasies

Michael C. Corballis, School of Psychology, University of Auckland, Auckland, New Zealand

Published: January 21, 2014

https://doi.org/10.1371/journal.pbio.1001767 (open access; see original for more)

Summary

Handedness and brain asymmetry are widely regarded as unique to humans, and associated with complementary functions such as a left-brain specialization for language and logic and a right-brain specialization for creativity and intuition. In fact, asymmetries are widespread among animals, and support the gradual evolution of asymmetrical functions such as language and tool use. Handedness and brain asymmetry are inborn and under partial genetic control, although the gene or genes responsible are not well established. Cognitive and emotional difficulties are sometimes associated with departures from the “norm” of right-handedness and left-brain language dominance, more often with the absence of these asymmetries than their reversal.

Evolution of Brain Asymmetries, with Implications for Language

One myth that persists even in some scientific circles is that asymmetry is uniquely human [3]. Left–right asymmetries of brain and behavior are now known to be widespread among both vertebrates and invertebrates [11], and can arise through a number of genetic, epigenetic, or neural mechanisms [12]. Many of these asymmetries parallel those in humans, or can be seen as evolutionary precursors. A strong left-hemispheric bias for action dynamics in marine mammals and in some primates and the left-hemisphere action biases in humans, perhaps including gesture, speech, and tool use, may derive from a common precursor [13]. A right-hemisphere dominance for emotion seems to be present in all primates so far investigated, suggesting an evolutionary continuity going back at least 30 to 40 million years [14]. A left-hemisphere dominance for vocalization has been shown in mice [15] and frogs [16], and may well relate to the leftward dominance for speech—although language itself is unique to humans and is not necessarily vocal, as sign languages remind us. Around two-thirds of chimpanzees are right-handed, especially in gesturing [17] and throwing [18], and also show left-sided enlargement in two cortical areas homologous to the main language areas in humans—namely, Broca’s area [19] and Wernicke’s area [20] (see Figure 1). These observations have been taken as evidence that language did not appear de novo in humans, as argued by Chomsky [21] and others, but evolved gradually through our primate lineage [22]. They have also been interpreted as evidence that language evolved not from primate calls, but from manual gestures [23][25].

Some accounts of language evolution (e.g., [25]) have focused on mirror neurons, first identified in the monkey brain in area F5 [26], a region homologous to Broca’s area in humans, but now considered part of an extensive network more widely homologous to the language network [27]. Mirror neurons are so called because they respond when the monkey performs an action, and also when they see another individual performing the same action. This “mirroring” of what the monkey sees onto what it does seems to provide a natural platform for the evolution of language, which likewise can be seen to involve a mapping of perception onto production. The motor theory of speech perception, for example, holds that we perceive speech sounds according to how we produce them, rather than through acoustic analysis [28]. Mirror neurons in monkeys also respond to the sounds of such physical actions as ripping paper or dropping a stick onto the floor, but they remain silent to animal calls [29]. This suggests an evolutionary trajectory in which mirror neurons emerged as a system for producing and understanding manual actions, but in the course of evolution became increasingly lateralized to the left brain, incorporating vocalization and gaining grammar-like complexity [30]. The left hemisphere is dominant for sign language as for spoken language [31].

Mirror neurons themselves have been victims of hyperbole and myth [32], with the neuroscientist Vilayanur Ramachandran once predicting that “mirror neurons will do for psychology what DNA did for biology” [33]. As the very name suggests, mirror neurons are often taken to be the basis of imitation, yet nonhuman primates are poor imitators. Further, the motor theory of speech perception does not account for the fact that speech can be understood by those deprived of the ability to speak, such as those with damage to Broca’s area. Even chimpanzees [34] and dogs [35] can learn to respond to simple spoken instructions, but cannot produce anything resembling human speech. An alternative is that mirror neurons are part of a system for calibrating movements to conform to perception, as a process of learning rather than direct imitation. A monkey repeatedly observes its hand movements to learn to reach accurately, and the babbling infant calibrates the production of sounds to match what she hears. Babies raised in households where sign language is used “babble” by making repetitive movements of the hands [36]. Moreover, it is this productive aspect of language, rather than the mechanisms of understanding, that shows the more pronounced bias to the left hemisphere [37].

Inborn Asymmetries

Handedness and cerebral asymmetries are detectable in the fetus. Ultrasound recording has shown that by the tenth week of gestation, the majority of fetuses move the right arm more than the left [38], and from the 15th week most suck the right thumb rather than the left [39]—an asymmetry strongly predictive of later handedness [40] (see Figure 2). In the first trimester, a majority of fetuses show a leftward enlargement of the choroid plexus [41], a structure within the ventricles known to synthesize peptides, growth factors, and cytokines that play a role in neurocortical development [42]. This asymmetry may be related to the leftward enlargement of the temporal planum (part of Wernicke’s area), evident at 31 weeks [43].

 In these prenatal brain asymmetries, around two-thirds of cases show the leftward bias. The same ratio applies to the asymmetry of the temporal planum in both infants and adults [44]. The incidence of right-handedness in the chimpanzee is also around 65–70 percent, as is a clockwise torque, in which the right hemisphere protrudes forwards and the left hemisphere rearwards, in both humans and great apes [45]. These and other asymmetries have led to the suggestion that a “default” asymmetry of around 65–70 percent, in great apes as well as humans, is inborn, with the asymmetry of human handedness and cerebral asymmetry for language increased to around 90 percent by “cultural literacy” [46].

Variations in Asymmetry

Whatever their “true” incidence, variations in handedness and cerebral asymmetry raise doubts as to the significance of the “standard” condition of right-handedness and left-cerebral specialization for language, along with other qualities associated with the left and right brains that so often feature in popular discourse. Handedness and cerebral asymmetry are not only variable, they are also imperfectly related. Some 95–99 percent of right-handed individuals are left-brained for language, but so are about 70 percent of left-handed individuals. Brain asymmetry for language may actually correlate more highly with brain asymmetry for skilled manual action, such as using tools [47],[48], which again supports the idea that language itself grew out of manual skill—perhaps initially through pantomime.

Even when the brain is at rest, brain imaging shows that there are asymmetries of activity in a number of regions. A factor analysis of these asymmetries revealed four different dimensions, each mutually uncorrelated. Only one of these dimensions corresponded to the language regions of the brain; the other three had to do with vision, internal thought, and attention [49]—vision and attention were biased toward the right hemisphere, language and internal thought to the left. This multidimensional aspect throws further doubt on the idea that cerebral asymmetry has some unitary and universal import.

Handedness, at least, is partly influenced by parental handedness, suggesting a genetic component [50], but genes can’t tell the whole story. For instance some 23 percent of monozygotic twins, who share the same genes, are of opposite handedness [51]. These so-called “mirror twins” have themselves fallen prey to a Through the Looking Glass myth; according to Martin Gardner [52], Lewis Carroll intended the twins Tweedledum and Tweedledee in that book to be enantiomers, or perfect three-dimensional mirror images in bodily form as well as in hand and brain function. Although some have argued that mirroring arises in the process of twinning itself [53],[54], large-scale studies suggest that handedness [55],[56] and cerebral asymmetry [57] in mirror twins are not subject to special mirroring effects. In the majority of twins of opposite handedness the left hemisphere is dominant for language in both twins, consistent with the finding that the majority of single-born left-handed individuals are also left-hemisphere dominant for language. In twins, as in the singly born, it is estimated that only about a quarter of the variation in handedness is due to genetic influences [56].

The manner in which handedness is inherited has been most successfully modeled by supposing that a gene or genes influence not whether the individual is right- or left-handed, but whether a bias to right-handedness will be expressed or not. In those lacking the “right shift” bias, the direction of handedness is a matter of chance; that is, left-handedness arises from the lack of a bias toward the right hand, and not from a “left-hand gene.” Such models can account reasonably well for the parental influence [58][60], and even for the relation between handedness and cerebral asymmetry if it is supposed that the same gene or genes bias the brain toward a left-sided dominance for speech [60],[61]. It now seems likely that a number of such genes are involved, but the basic insight that genes influence whether or not a given directional bias is expressed, rather than whether or not it can be reversed, remains plausible (see Box 1).

Genetic considerations aside, departures from right-handedness or left-cerebral dominance have sometimes been linked to disabilities. In the 1920s and 1930s, the American physician Samuel Torrey Orton attributed both reading disability and stuttering to a failure to establish cerebral dominance [62]. Orton’s views declined in influence, perhaps in part because he held eccentric ideas about interhemispheric reversals giving rise to left–right confusions [63], and in part because learning-theory explanations came to be preferred to neurological ones. In a recent article, Dorothy Bishop reverses Orton’s argument, suggesting that weak cerebral lateralization may itself result from impaired language learning [64]. Either way, the idea of an association between disability and failure of cerebral dominance may be due for revival, as recent studies have suggested that ambidexterity, or a lack of clear handedness or cerebral asymmetry, is indeed associated with stuttering [65] and deficits in academic skills [66], as well as mental health difficulties [67] and schizophrenia (see Box 1).

Although it may be the absence of asymmetry rather than its reversal that can be linked to problems of social or educational adjustment, left-handed individuals have often been regarded as deficient or contrarian, but this may be based more on prejudice than on the facts. Left-handers have excelled in all walks of life. They include five of the past seven US presidents, sports stars such as Rafael Nadal in tennis and Babe Ruth in baseball, and Renaissance man Leonardo da Vinci, perhaps the greatest genius of all time.

Archaic H. sapiens – H. sapiens sapiens / Testosterone

A composite image shows the facial differences between an ancient modern human (archaic Homo sapiens), with heavy brows and a large upper face, and the more recent modern human (Homo sapiens), who has rounder features and a much less prominent brow. The prominence of these features can be directly traced to the influence (reduction) of the hormone testosterone. Photo Credit: Robert Cieri, University of Utah

Archaic vs Modern

Read more: http://www.zmescience.com/science/archaeology/civilization-testosterone-skull-04082014/

“It is important to note that lower testosterone is associated with tolerance and cooperation in bonobos and chimpanzees, and with less aggression in humans. It seems very plausible that as humans started to group up in larger and more interconnected settlements, they needed to find less violent ways to sort out their problems – and in the long run, the non-violent path won.”



Cave Art in Indonesia as old as in Europe / 39,000 ya

Gee Whiz! Could it be that Archaic Humans, including H. erectus, H. neanderthalensis and H. sapiens, were visual thinkers?


Eurocentric Archaeology and Anthropology take a hit:

Pleistocene cave art from Sulawesi, Indonesia

M. Aubert, A. Brumm, M. Ramli, T. Sutikna, E. W. Saptomo, B. Hakim, M. J. Morwood, G. D. van den Bergh, L. Kinsley, A. Dosseto (see original for affiliations)

Nature / Volume: 514; October 2014
Archaeologists have long been puzzled by the appearance in Europe ~40–35 thousand years (kyr) ago of a rich corpus of sophisticated artworks, including parietal art (that is, paintings, drawings and engravings on immobile rock surfaces) [1, 2] and portable art (for example, carved figurines) [3, 4], and the absence or scarcity of equivalent, well-dated evidence elsewhere, especially along early human migration routes in South Asia and the Far East, including Wallacea and Australia [5–8], where modern humans (Homo sapiens) were established by 50 kyr ago [9, 10]. Here, using uranium-series dating of coralloid speleothems directly associated with 12 human hand stencils and two figurative animal depictions from seven cave sites in the Maros karsts of Sulawesi, we show that rock art traditions on this Indonesian island are at least compatible in age with the oldest European art [11]. The earliest dated image from Maros, with a minimum age of 39.9 kyr, is now the oldest known hand stencil in the world. In addition, a painting of a babirusa (‘pig-deer’) made at least 35.4 kyr ago is among the earliest dated figurative depictions worldwide, if not the earliest one. Among the implications, it can now be demonstrated that humans were producing rock art by ~40 kyr ago at opposite ends of the Pleistocene Eurasian world.
_______________________________________________________________________________________

“Ancient cave drawings found in Indonesia show that early Europeans weren’t the only ones creating art. Known as the Sulawesi paintings, the prehistoric images were discovered some years ago inside limestone caves in Indonesia’s Maros and Pangkep regions. The drawings, which include depictions of animals and hand stencils created by spraying red pigment on to the rock face, have been analyzed using sophisticated new dating techniques and are now believed to date back at least 40,000 years. The discovery is particularly important because it shows that primitive forms of artistic expression were not exclusive to the people living in Europe at the time.”

“Cave painting and related forms of artistic expression were most likely part of the cultural traditions of the first modern humans to spread out of Africa and into Asia and Australia, long before they reached Europe,” said study co-author Adam Brumm. – See more at: http://www.unexplained-mysteries.com/news/273448/indonesia-rock-art-dates-back-40000-years

___________________________________________________________________________________________

“Until now, we’ve always believed that cave painting was part of a suite of complex symbolic behavior that humans invented in Europe,” says archaeologist Alistair Pike of the University of Southampton in the United Kingdom. “This is actually showing that it’s highly unlikely that the origin of painting caves was in Europe.”

___________________________________________________________________________________________

“What this suggests is that this whole ability to make these things and possibly the tradition of making them is part of the cultural repertoire of the people who left Africa.” – Alison Brooks, archaeologist, George Washington University
