
Psychiatry and the Evolution of Human Language

May 2nd, 2016 by Robert DePaolo | Posted in Psychology

by Robert DePaolo

Abstract

This article discusses the therapeutic effect of language in an evolutionary context. The point is made that while the evolution of human language served a number of social/interactive, analytical and action-planning purposes, a parallel benefit accrued as well: enhanced capacities for self-control as insulation against environmental dangers and setbacks, and the ability to use internal deception to provide psychic endurance past the point of apparent helplessness.

Parameters

The origin of human language is a very complicated topic. It is one reason why some insist that man in some ways transcends the rest of the animal kingdom (Ptolemy, 2009). Indeed a creationist might refer to human language development as proof of a higher power or, at the very least, a super-organizational entity.

A glance at the language development of a young child provides grist for the mill. The infant begins with a vast number of loosely connected neurons in its brain which are barely able to coordinate basic motor skills for the first several years. In that same time frame the child acquires, exponentially, a capacity to think and speak in terms of past, present and future.

This is particularly interesting because the child’s brain is developing on several fronts during the first few years. Perceptual circuits, motor circuits, emotional (midbrain) wiring, and the arrangement of vertical and horizontal pathways in the cerebral cortex are also developing, and must in order for cognition (which depends on such experiential content) to expand. Yet while the child does not go from taking his first steps to running a 9.5-second hundred-meter dash in three years, his language skills broaden at an astonishingly rapid pace. His brain development is skewed in the direction of language acquisition. One can speculate about how this ties in to human evolution.

The Anatomy of Pleasure

Some aspects of human language evolution were predictable due to the emergence of opportunistic human anatomy. For example, the lowering of the hyoid bone in the throat enabled our ancestors to extend vowel sounds. This would have enhanced their phonetic variety and provided several advantages. One would have been esthetic.

Due to enhanced breath control, the capacity to sustain and vary vowel sounds could have produced an auditory end product (including song) attractive not only to members of the opposite sex but to all members of the social group. The reason for this assertion derives from a common theory of esthetics, stated at various times by St. Augustine, Descartes, and in more recent times by Berlyne (1960, 1975) and Zeki (2009). The theory states that any stimulus that includes both familiarity and novelty will be perceived as pleasurable and that the pleasure could take any form or serve any purpose, including music, inspiration, seduction, even humor.

The underlying reason has to do with the two aspects of the pleasure response – search and closure. Pleasure is not a free experience. It requires some degree of perceptual work (Freud, 1928). One must earn it by converting irresolution into resolution. Faced with a totally unfamiliar stimulus complex, the perceiver will tend to withdraw from it because he cannot “wrap his brain” around the input. By the same token, encountering a completely familiar stimulus complex will also lead to avoidance, because no perceptual work is required. Any experience that precludes a successful search cannot result in resolution, i.e. closure. In that context, the ideal “pleasure proportion” – featuring a workable, resolvable combination of sameness and novelty – would have accompanied human expression from the outset.
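The inverted-U relationship described above – pleasure peaking where familiarity and novelty balance, and falling off at both extremes – can be sketched as a toy function. The quadratic form and the 0-to-1 novelty scale are illustrative assumptions, not Berlyne’s actual model.

```python
def hedonic_value(novelty: float) -> float:
    """Toy pleasure score for a stimulus.

    novelty: 0.0 = completely familiar, 1.0 = completely unfamiliar.
    The score is zero at both extremes (no perceptual work possible,
    or no perceptual work required) and peaks at the balance point.
    """
    if not 0.0 <= novelty <= 1.0:
        raise ValueError("novelty must lie in [0, 1]")
    return 4 * novelty * (1 - novelty)  # 0 at 0.0 and 1.0, maximal at 0.5

# Both extremes are avoided; the mixed stimulus is preferred.
print(hedonic_value(0.0))   # fully familiar  -> 0.0
print(hedonic_value(1.0))   # fully novel     -> 0.0
print(hedonic_value(0.5))   # balanced        -> 1.0
```

The specific peak location and symmetry are arbitrary here; the point is only that search-and-closure predicts avoidance at both ends of the familiarity spectrum.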

The First Crooners

Many believe descent of the hyoid bone first appeared with Neanderthal, though it seems rather unlikely that, even if Homo erectus’ hyoid position precluded extended breath control and vowelization, his cognitive abilities could have developed without some sort of linguistic impetus (bearing in mind that it is also possible to speak without vowels, as exemplified by the click language of the !Kung people in Africa).

Whether or not hyoid descent was confined to Neanderthal and Sapiens is not certain, but considering the Darwinian concept of conversion, whereby a trait initially favoring one skill is eventually used for others, it seems likely that early versions of both species quickly learned to profit from this newfound oral capability. It is also likely that this capacity was refined over time alongside its physiological correlate, upright walking (bipedalism allows for such changes in bodily structures, including the spinal cord, digestive tract and voice box). This trait was obviously passed on to our own species.

The Expansive Song List

As his phonetic breadth increased, early Homo sapiens’ expressive skills were more finely orchestrated. That would have enabled him to learn search-and-closure expressive sound patterns that were attractive to listeners. Considering man’s primate-consonant penchant for mimicry, this pleasurable communicative trick would have caught on. Early man would have created a new, dual aspect of experience, combining pleasure with object, action and descriptive labels that could be applied to experiences and people.

At that point the stage was set for a new version of “alpha” who gained his or her influence not by the typical primate criteria of physical prowess and dominance but by esthetic expression. Over time, as human society expanded and roles proliferated, the expressive alpha tactic filtered down through the ranks. Musicians, prophets, healers, ministers, writers, poets, minstrels, actors, politicians, mind healers and eventually psychotherapists were included in the neo-paradigm, each able to establish themselves in the hierarchy by word as well as deed.

In terms of that scenario one could argue that the benefits derived from being an “esthetic communicator” would have been enough to set human speech on its developmental path. Unfortunately, the analytic complexity of human language makes that doubtful. Running parallel to the esthetic factor is the element of meaning. In order for the listener to respond to the esthetic value of language he must understand the concepts and vocabulary of the statements. In terms of our own sociobiological evolution, that required an additional capacity within the brain: the speaker had to gauge the level of intelligence and overall cognitive demeanor of the listener, process his facial expressions, body posture and movement trends so as to distinguish approval/comprehension from confusion/frustration, and adjust the message accordingly.

The Evolution of Empathy

With regard to how, in the course of evolution, Homo sapiens developed a capacity to read the reactions of others, one possibility comes to mind. The brain of a primate is large relative to body size, but while that allows for a variety of perceptual, motor and emotionally induced behavior patterns, the most basic function of the primate brain might lie in a capacity to imitate.

Contrary to the oft-stated notion that the “brain is too complex for humans to understand” (Doglas, 2006), one can reasonably view the brain as a kind of copy machine. If not for that foundation, long-term memory would be a pipe dream and our emotional depth would be restricted to momentary fight/flight experiences. When inputs impinge on the brain, it appears the brain’s neuronal patterns replicate the energy signature patterns of the outside source in isomorphic fashion (Lehar, 1999). For example, the visual energy signals of a tiger are replicated as if the color and dimension of the tiger are housed in the perceiver’s brain. Over time various associations can be developed around that neural process, but in the first instance the brain copies what it sees, hears and feels.

Since imitation is conceivably neuro-primal, one can assume that one person interacting with another is in some sense incorporating that person into a neural scheme via a representation process, e.g. “The other person, place or thing becomes me.” This notion is supported by studies showing a tendency for people to unconsciously mimic the patterns of others. For example, the menstrual cycles of females living together tend toward temporal synchrony, as per the so-called McClintock effect (Arden, Dye et al., 1999). Also, the emotional arousal levels of one person will tend to rise and fall in sync with those of the person with whom he is interacting (Konvalinka, Xygalatas et al., 2011). This is one reason why social interaction can be so stressful to therapists, or for that matter anyone involved in a commiserative exchange.

However, another question arises: specifically, what can be inferred about how internal mental processes such as imitation, empathy and communicative esthetics play out in evolutionary terms?

The Ice Age

The climate was in an extreme cooling phase during the first human occupation of Europe and what is now called the Middle East. Living on frozen tundra had enormous ramifications for survival. In order to understand this, one must look beyond traditional paleo-anthropological concepts toward one more psycho-anthropological. This is not to minimize the challenges of the physical environment. For example, with a scarcity of plants the Neanderthals and Cro-Magnons had to rely more on a diet of meat. That altered their behavior patterns in a quite natural way. Also, most of the game animals at the time were massive, and with only Acheulean axes and perhaps spears, early man had to refine his hunting and planning skills.

There was a problem inherent in that. Early man had to work extremely hard and long to carry on these and other arduous, lengthy tasks, yet cold weather has a dampening effect on hormones, activation level, metabolism, mood and motivation, especially if a species migrates from the African savanna to western Europe without sufficient time for physiological adaptation (Leppauoto, Hassi 1991). In modern times we recognize that seasonal shifts from fall to winter are often accompanied by psychopathologies. While naturalists typically distinguish mammals from reptiles based on our constant body temperature and their lack thereof, mammals also experience downward physiological shifts in cold weather. Indeed, the fact that human metabolism slows down considerably in winter and speeds up in spring is one reason why suicides tend to be more frequent in the crossover between the two seasons. The transition from a dormant season to an active season can be physiologically as well as emotionally overwhelming for depression-prone individuals.

Carrying out the strenuous activities needed for survival in a cold climate (bearing in mind that there was no techno-culture to offer support) required a reliable, internally driven mechanism by which to summon and sustain activity levels when nature created an opposite effect. In simple terms it required the development and implementation of an energy-regulating mechanism – the psyche.

The emergence – one hesitates to say ‘evolution’ – of such an ergonomic aspect of mind was probably less than completely adaptive because, as both Freud and Maxwell suggested, energy is a neutral phenomenon that can create or destroy. The new aspect of mind was more complex and vagarious than, say, upright walking or tool use, and had to be channeled properly to advance the cause of survival.

An Adaptive Paradox

It is generally assumed that Neanderthal had a sense of his own mortality, as per his habit of burying the dead (Than, 2013). That speaks to his psychological sophistication. Yet in some ways mourning is an irrational behavior. The deceased is no longer functional in the group, and his absence has no subsequent influence on survival and adaptation. Moreover, since man was/is quite familiar with his own mortality, there is no logical reason to feel emotionally “robbed” by the death of a family member or colleague. Yet death is a devastating experience, largely because we internalize it beyond mere logic. “How can I bring him back – through memory and legend?”… “What if I’m next to go?”…

Such contemplative, emotional largesse, combined with a cold climate, could have led to feelings of helplessness, futility and depression – a double whammy – if not for an adaptive conversion that modern clinicians will undoubtedly recognize.

The Self and the Artist

The adaptive mechanisms enabling early man to overcome these obstacles were likely in the language domain; not just communication but three other forms that create a capacity for self-deception: art, self-talk and self-regulation. With respect to art, having an internal language capacity would have led to an ability to imitate and represent the outside world. In his cave paintings early man could depict the various animals around him, and create symbols of people within his social group. He might even come up with the idea of an alpha even higher in rank than those in his tribe, who could not only rescue him from his obscurity within the group but orchestrate all of nature – a god.

The artistic depictions at Lascaux and Altamira might have recaptured life, but perhaps only due to a prior skill set. In order to find his inspiration the artist had to preplan as to subject, background, color scheme etc. In so doing he would have begun with a self-talk exercise (not necessarily overt but perhaps covert) whereby his expanding frontal lobes vacuumed associations from Broca’s area in the frontal lobe, so that the vast inhibitory circuits of the former could whittle language down into a fractional, internal version of speech, resulting in inner contemplation.

Through that guiding process, size, angulation and action depictions could be altered. All aspects of life could be restructured so that victories could be imagined, dangers avoided, animal spirits created from scratch, joyful outcomes conjured up in the absence of direct experience or concrete possibilities.

The benefits of this new, internal capacity would have proliferated as a result of social influence and mimicry. It would have facilitated artistic expression and, through its capacity to re-shape experience, also enabled our ancestors to summon and sustain arousal, motivational and behavioral activity levels in a sparse and hostile terrain.

Up to that point cruel nature had decided which creatures could carry on and which would fade into oblivion. Then man came along and was able to outwork and to an extent circumvent its whim.

From the Cave to the Couch

Any psychotherapist can see the connection of the above elements to the present-day use of defense mechanisms in providing emotional equanimity during difficult times. Homo sapiens appears to have embarked on a somewhat ironic evolutionary path. In taking a ramp off the road of natural selection we developed a secondary, alternative and internal world in which we can distort experience at will in accord with our needs and frailties. Darwin’s theory of natural selection holds that organisms survive through random mutations that over time are selected or rejected, based on their juxtaposition with the outside world. That rule was overturned to an extent with the emergence of Homo sapiens. Indeed one could argue that we have actually been able to separate ourselves from the outside world and that human evolution offers a partial exception to the Darwinian rule. Does this mean we are transcendent, or as Darwin and Huxley surmised, merely a biological after-effect arising from the initiative of mindless nature? Such a question is highly philosophical, maybe even moot. Yet it is difficult to refute the notion that the rise of mankind was marked by an ongoing competition between nature and the imagination.

REFERENCES

Arden, M.A., Dye, L., Walker, A. (1999) Menstrual Synchrony: Awareness and Subjective Experience. Journal of Reproductive and Infant Psychology, 17(3), 255-265.

Berlyne, D.E. (1960) Conflict, Arousal and Curiosity. McGraw-Hill.

Berlyne, D.E. (1975) Aesthetics and Psychobiology, National Arts Educational Association.

Doglas, Y. (2006) The Paradox of the Brain. Neuroscience. Sept. 15, 2006

Freud, S. (1928) Humour. International Journal of Psychoanalysis, 9, 1-6.

Konvalinka, I., Xygalatas, D., Bulbulia, J., Schjødt, U., Jegindø, E.-M., Wallot, S., Van Orden, G., Roepstorff, A. (2011) Synchronized Arousal Between Performers and Related Spectators in a Fire-Walking Ritual. Proceedings of the National Academy of Sciences, 108(20), 8514-8519.

Lehar, S. (1999) Gestalt Isomorphism and the Primacy of the Subjective Perceptual Experience. Behavioral and Brain Sciences.

Leppauoto, J., Hassi, J. (1991) Human Physiological Adaptation to the Arctic Climate. Arctic, 44(2), 130-145.

Ptolemy, D. (2009) Notes on Man’s Transcendence. Derived from the film Transcendent Man, Ptolemaic Productions/Therapy Studios.

Than, K. (2013) “Ancient Ritual”: A 50,000-Year-Old Neanderthal Discovered in a Cave in France Was Intentionally Buried. National Geographic, Dec. 16, 2013.

Zeki, S. (2009) Statement on Neuroesthetics. Neuroesthetics. Web. Nov. 2009


Mass Murder: The Psychology of Prediction and Prevention

February 14th, 2016 by Robert DePaolo | Posted in Psychology


by Robert DePaolo

Abstract

This article discusses the dynamics of human aggression and specifically the proclivity toward killing strangers, via the mass murder scenario. The point is made that a package of traits involving perceived estrangement, pseudo-familiarity toward the victims, a nihilistic outlook (depression), obsession and ego-dysfunction combines to increase the probability of such an act.

Genes and the Ego…

If one asks a liberal democrat why people engage in mass murder, he or she will typically say it has to do with the availability of guns in American society. If one asks a conservative the same question, he or she will likely say it has to do with mental illness, and that the mere presence of guns does not correlate with the act of committing mass murder – any more than watching violent movies automatically foments violent behavior. If one were to conduct a statistical correlation study of the number and/or percentage of gun owners vs. the number and percentage of people who use guns to commit mass murder, the correlation would be so low as to be statistically insignificant (Peterson, 2014; Kennedy, Skeem et al., 2012). The same would be true with regard to people diagnosed with mental illness, including psychotic disorders. In fact it takes a fairly organized mind to plan such an act, as well as an intact enough reality orientation to select targets, obtain weapons, and plan time, place etc.
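The kind of correlation study alluded to above can be sketched in a few lines. The regional figures and variable names below are invented purely for illustration and carry no empirical weight; the point is only to show the Pearson coefficient one would compute.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-region gun-ownership rates and mass-murder incident
# rates (made-up numbers, for illustration only).
ownership = [0.20, 0.35, 0.50, 0.15, 0.45, 0.30, 0.60, 0.25]
incidents = [0.03, 0.01, 0.02, 0.04, 0.01, 0.03, 0.02, 0.02]

print(f"Pearson r = {pearson_r(ownership, incidents):.2f}")
```

A coefficient near zero on real data would support the article’s claim of statistical insignificance; establishing that properly would of course also require a significance test against the sample size.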

Of course guns are used to kill, and some mass murderers have been presumed to have mental disorders. Yet developing a predictive model by which to identify potential mass murderers would seem to require more than a liberal or conservative argument, and perhaps encompass more than gun ownership or severe mental illness. That raises a question. To wit: if neither mental illness nor gun ownership is a necessary and sufficient antecedent, where does one look to find a preventative model?

Genes and Detachment…

A number of biologists have surmised that embedded in virtually all organisms is a disposition to behave altruistically toward those within the same gene pool, and/or those with whom one has familiarity on a regular basis (Thompson, Hurd et al., 2013; Wendell, 2013) – especially if the relationship involves cooperative, mutually beneficial interactions. On the other hand it seems that as the genetic and social ties drift off into dissimilarity and non-mutuality, the potential for aggression increases. It isn’t necessarily that our genes foment hate based on the differences between organism A and organism B. It’s just that with differences (real or perceived) the protective instinct dissipates. With respect to the psychology of Homo sapiens it is left to the individual to fill in the gaps. Various outcomes can arise from that sense of estrangement.

As the most social of the primates we are extremely dependent on social interaction. Various studies on public speaking and other performance-related experiences, as exemplified by a survey conducted by the Self Help Collective (2014), suggest humans fear ostracism more than death. When children are neglected or ignored in early development they not only become un-socialized but resentful and aggressive.

In a sense the necessary ties we have with one another comprise our greatest strength; giving us hospitals, caring parents, mentors, supportive agencies, even clerics willing to hear about our sins, trials and tribulations and offer solace. However when that “social compact” (i.e. the expectation of nurturance and reciprocity) is violated it is often perceived as betrayal. That can lead to retaliation.

If detachment-driven revenge were all there were to it, prediction and prevention would be fairly easy and, rather ominously, there would be more potential mass murderers. Obviously there is more to it than that. In fact another, paradoxical and dual psychic component seems prerequisite. One part consists of a self-perceived closeness to the target group or individual, because it is difficult to hate, let alone kill, people who are inconsequential in one’s life. In order to develop the essential mindset enabling one to pass beyond the regulatory barriers of ego and superego, the avenger must categorize his victims to a point of utter familiarity. A second part (comprising the paradox) involves the perception that the targets are alien and distressingly unrecognizable.

It is a paradigm adopted by Adolf Hitler, who first espoused clear separation between the “Aryan race” and virtually all others, then assigned distinctive traits to his targets (Jews and others), turning them into familiar entities. Recognition – separation: these elements can set the stage for ostensibly any anti-social act, including mass murder.

Such a dynamic can be seen in what might be called estrangement murders as well. Mark David Chapman (the murderer of John Lennon) and Ted Bundy, who targeted women who resembled a woman who had rejected him prior to his rampage yet were in fact complete strangers, are prime examples of this. It has also typified the onset of wars throughout human history, whereby specific traits and distinctions were assigned to enemies – by race, religion, ethnicity – thus releasing the demons within our species to freely kill and maim.

The Freedom of Psychic Rigidity…

Thus far, the elements of estrangement and quasi-familiarity have been viewed as requisite psychological components. In fact there are other variables to consider. One is considered here to be a kind of “theory of everything” with respect to anti-social behavior – the obsession. In order to discuss this it might be best to begin with Freud’s concept of the ego. This notion is a bit general but can be said to correspond to the self-awareness, linguistic, planning and regulatory functions of the prefrontal cortex. Functionally the ego is designed to prevent extremes. It is a psychic clearing house, an arbiter of reason invariably weighing factors such as risk, reward, self-image congruence, social ramifications and emotional repercussions in the process of behavioral selection (Strachey, 1990). If it can be bypassed, all those modulating capabilities can go by the boards, making each of us capable of extreme behavior patterns. With a depleted ego man can indeed potentially devolve into a “killer ape.”

The intact ego is pan-influential, so most of us operate via multiple faculties, weaving cognition, emotion, perception, memory and behavior into socially acceptable patterns. That’s why most people with guns do not murder and why most viewers of violent movies do not go out and commit violent crimes. The ego filter is a greater deterrent to sociopathy than gun laws or artistic restrictions. It provides a pro-social tether that can rein in aggression so we can think beyond the moment. On the other hand, the tether can be severed by obsessive-compulsive features.

Overriding the ego’s considerable breadth and influence requires extraordinary force. An emotion-induced obsession suits that purpose very well. But it must be more than an episodic occurrence produced, for example, by situational or “state” anxiety. It must be a fairly ingrained trait, created either by innate temperament, doctrinaire conversions, trauma or neurologically induced rigidity – the latter of which can produce hyperactive, perseverative tendencies in the brain, forcing it to operate on a single track to the exclusion of peripheral inputs and sensations.

Language and Tunnel Vision…

As Simon (1967) demonstrated, emotional states tend to create a narrow focus. This trait is adaptive in nature but often maladaptive in society. Being pursued by a predator certainly requires a singular focus on escape. On the other hand, many social situations require subtlety even in emotional circumstances. Thus the human animal must be able to modulate what is essentially an adaptive mandate. The only way for this to occur is through the self-regulatory capacities of language. The ego is ultimately a language medium. Beyond expressive skills, two other linguistic skills are needed to rein in destructive emotional behavior. One is a capability (honed through practice) to talk one’s self through duress. The other is a set of experiences whereby, through interactions with others, a sense of consensual reality is developed so that the individual can “think as others think” and thereby preclude extreme thoughts and feelings.

All of those skills are ego-driven as Freud suggested, thus language habits (particularly a capacity for self-reflection and regular opportunities to “check in” with others in conversation) are a critical component in deterring the urge to kill.

Depression…

Seligman (1975) wrote about depression as being typically accompanied by a sense of helplessness, i.e. the perception that one’s behavior cannot resolve disparities between needs and frustrations. While depression might seem a harmless emotional state – at least in its impact on others – it is in fact quite dangerous. Running out of solutions leads to nihilism, i.e. a belief that nothing matters. That leads to a ‘nothing to lose’ mindset which can not only remove the fear of retribution but make it seem somehow attractive.

Aggression…

Thus far several variables have been discussed with respect to the psychological makeup of a mass murderer. One obvious feature – aggression – has been left out. One might assume that engaging in mass murder requires an aggressive outlook. There is one problem with that, however. People get angry all the time yet do not act out in such extreme ways. Moreover, mass murderers often do not seem to be responding to grudges or momentary pique. Nor do they necessarily have a history of assaultive behavior (Hsu, 2012). Indeed the act can sometimes appear random and devoid of imminent rancor toward those around the perpetrator. Thus mass murder might not be purely an act of rage. As strange as it seems, the act might be in its essence more cognitive than emotional. The question is… what cognitive process would it involve?

Totem Pole Demons…

Recently, President Obama (and many others) expressed frustration over the fact that mass murder happens much more often in American society than in all other advanced societies – thus his call for stricter gun laws. He is probably wrong about that, since many nations have experienced mass murder, including Russia, England, Germany and many other advanced countries, going even further back than Jack the Ripper. In that context it makes sense to look beyond a specific nation for the solution – indeed to the very nature of our species. Homo sapiens is a primate, or at least shares many neurobehavioral traits with other primates. One such trait is a large brain, which in turn disposes all primates to be highly social. The reason is fairly obvious. Big brains enable their owners to perceive faces, behaviors, physical characteristics, vocalizations and visual inputs in more detail. As a result primates are better able to distinguish among members of the group; for example to determine which ones are dominant, submissive, sexually receptive, emotionally unsteady, intelligent, wily, deceptive and so on. As Pinker (1997) has noted, some of the impetus for expansion of the human brain was to enhance social perception to meet the demands of increasingly complex human interaction.

One of the most significant byproducts of this process in humans is something that can be referred to as the “equi-hierarchy.” In chimp and gorilla social groups there is little question of rank. The silverback runs the show in gorilla groups and the alpha male rules in chimp society. Most of this is based on size and strength. Yet the chimps in particular experience political revolutions from time to time, as lower-ranked males stage a coup against the alpha male. That is in part because they can see beyond size and strength. They see numbers as an advantage. They can also consider the age and physical decline of the alpha and other signs that he is “losing it” and plot against him in accord with those perceptions.

Human perception enables us to see an even greater number of variables. Size, strength, intelligence, artistic ability, social notoriety, sexual prowess, financial status and so many other features are symbols carved out on the totem pole of human experience. The net result is that the human hierarchy is so fluid as to be virtually in the eyes of the beholder. One might even argue that beyond historical stimuli such as the Magna Carta, Locke’s treatise on the social contract or the Declaration of Independence, it was basic human neuro-psychology that created an inevitable drift toward democracy, particularly after the advent of the handgun, which really leveled the playing field.

The equi-hierarchical phenomenon has enormous impact on human thought, which is reflected in the obsessive human need for attention, approval, fame etc. It disposes us to feel entitled to totem pole ascendancy; not just as a possibility but as a basic right. One byproduct of a doctrine of equality (not just before the law – which was the sole intent of the constitutional framers – but in all aspects of life) is potential conflict, both within the individual and in his interactions with society. This dynamic can provide justification for a mass murderer. The attitude… “if society deems me equal to all others yet I fail to ascend the ranks, it is the fault of society and I am no longer obliged to live by its rules”… can prevail, making mass murder less an emotional act than an intellectual assessment.

In that context, President Obama might be more accurate to say such crimes are more prevalent in societies with a low emphasis on acceptance, with highly fluid versions of equality and in societies where (perhaps through media-fueled emphasis on universal fame and notoriety) citizens are unable to find satisfaction in life’s smaller conquests and achievements. By that reasoning a philosophical society, or one with religious overtones – for example via the teaching of acceptance and a capacity to defer to a higher power – might well produce fewer violent individuals than a non-philosophical or godless society.

Diagnostics and Prevention…

The ultimate question is whether the above discussion can be applied to prevention. While no system is even close to perfect in predicting human behavior there are criteria that could increase the likelihood of detection and prevention of mass murder. For example an individual might be disposed to mass violence when…

1. There has been a long period of social isolation wherein ego functions could have been depleted by lack of normalizing social interaction and where an internal language capacity honed by faith or moral teaching is absent or diminished.

2. The individual expresses a sense of threat and betrayal due to a perceived, threatening disparity between his aspirations, self-perceptions and idealized status on the one hand and his actual life outcomes on the other.

3. When parents and family dynamics emphasize the “specialness” of the individual despite his or her average traits and accomplishments – which would set the stage for threat inducing cognitive dissonance.

4. When cognitive and language habits reflect a professed ultra-familiarity with the traits, habits and motives of potential targets whom the individual does not actually know.

5. The presence of mental rigidity, in the form of repeated complaints, stock phrases in the characterization of others, reliance on rituals and habits, extreme emotional resistance to change, and even rigid eye focus, i.e. a lack of the shifting eye movements that correspond to deliberation and the weighing of ideas, and that also reflect a pause capability between feeling and acting indicative of working ego functions.

6. The presence of depressive traits, with accompanying feelings of helplessness and a nothing-to-lose mentality. That particular trait removes, to an extent, the fear of getting caught and punished and skews the risk-reward ratio toward antisocial behavior.

Each of the above factors could be seen in the behaviors of various mass murderers and serial killers. Certain factors might be more pronounced than others; for example some might have ample social contact but excessive rigidity and dissonance between self-image and actual status, as well as a closeness-detachment fixation toward others. Just how these factors could be used in a predictive scenario, whether by clinicians, family members, police or the courts, is hard to say. It could be a question of proportion, whereby some of these factors might be non-existent while others might be pronounced.

This does at least hopefully provide a beginning model for identifying individuals at risk, one that can be used even in non-clinical settings as a means of reducing, if not eliminating, the occurrence of such acts.

REFERENCES

Freud, S. (1990) The Ego and the Id. The Standard Edition of the Complete Works of Sigmund Freud. J. Strachey (Ed.) W. W. Norton.

Hsu, J. From an Article in Tech. News Daily Dec. 14, 2012. Can Mental Screening Predict Mass Murder?

Kennedy, P. Skeem, J. Bray, B. Zvonkovic, A. Law and Human Behavior.

Peterson, J. How often and how consistently do symptoms directly precede criminal behavior among offenders with mental Illness? Article in American Psychological Association Newsletter 21, 2004.

Pinker, S. (1997) How the Mind Works. W.W. Norton

Self Help Reference: Fear of Ostracism discussion was derived from a list compiled by the Self Help Collective at selfhelpcollective.com in 2014.

Seligman, M.E.P. (1975) Helplessness: On Depression, Development and Death. San Francisco, W.H. Freeman

Simon, H. (1967) Motivational and Emotional Controls of Cognition. Psychological Review. 74 (1) 29-39.

Thompson, G.L., Hurd, P.L., Crespi, B.J. (2013) Genes Underlying Altruism. Biology Letters

Wendell, J. (2013) Epigenetics Sheds New Light on Altruism. Genetic Literacy Project


On Nature…and the Nature of Cognition

January 16th, 2016 by Robert DePaolo | Posted in Psychology

by Robert DePaolo

Abstract

This article discusses the roots of cognition and decision making. Those phenomena are assumed to originate with the first molecular interactions in semi-closed systems, made possible by the advent of lipid-secured membranes, which create interactive feedback as opposed to the “drift” seen in open systems. This prototype is viewed as a precursor to advanced cognitive abilities, culminating in human cognition.

Whatever is meant by terms such as cognition, intelligence and memory cannot readily be described in concrete organic or neuro-functional terms, because there does not seem to be any direct correlation between the wiring or interconnections within the animal brain and the capacities to use basic behavioral mechanisms such as flight, aggression, altruism, or even instinct (Timmer 2012). Part of the problem lies in semantics, particularly since it is Homo sapiens sapiens doing the classifying. We are the only scientists in nature, thus our descriptions of nature are both confined to and embellished by our own values and experiences. As owners of a massive cerebral cortex we typically view functions such as altruism and long term planning as the result of big brains. Indeed several renowned thinkers in fields as diverse as psychology, anthropology and neurology have surmised that such functions are luxuries bestowed on us through the evolutionary expansion of the frontal cortical lobes (Rowe, Bullock et al. 2001). Others have, through meticulous research, concluded that limbic circuits within the human brain, like the hippocampus, provide memory consolidation capacities (Suzuki, Yanike et al. 2004). It appears that during a learning task the hippocampus becomes active once the task has been learned, analogous to a librarian who gathers the books and re-organizes them into proper catalogue sequence after the readers have gone home.

On the other hand, most animals and all plants lack a hippocampus and have no frontal lobe. For example a female cobra’s brain is minuscule compared to ours, yet once she recognizes she is pregnant, she digs nests to hide her young, obviously contemplating the possibility that predators and/or unfriendly climatic conditions might jeopardize their safety.

So many other ostensibly complex cognitive abilities are seen in this neurologically simple creature. To prepare for the birth of offspring, the female cobra must on some level be a naturalist, cognizant of the habits of predators. She must be a prognosticator, able to guess as to possible variables related to temperature and terrain. She must also be empathic enough to put herself in the shoes of a hungry predator who feeds on cobra eggs. In effect she must sense their hunger by extrapolating from her own. She must be a chronologist, capable of determining when her offspring will emerge from the eggs and altruistic enough to not only become enraged and frightened over threats to her offspring but also to feel satisfaction when her protective gestures lead to successful outcomes as her offspring make their way into the world safe and sound.

Interestingly, one can attribute all these cognitive and emotional traits to organisms even simpler neurologically than the cobra. It is even possible to extend the discussion of so-called cognitive abilities to single-celled creatures, because many appear to make decisions in adapting to their surroundings. For example amoebae engage in foraging behaviors tantamount to a hunting strategy (Cumming, Hall et al. 2010). Even at the level of the most primordial organisms – bacteria and viruses, for that matter mere molecules such as RNA and DNA – there are decisions being made without the use of brains, language or memory substrates, at least as viewed in a conventional sense.

The Search for a Cognitive Monolaw…

In discussing the nature of cognition, one can attempt to gain closure in several ways. One way is by assuming that all the behavioral traits seen in brainless and meagerly encephalized organisms are instinctive, thus built into their rigid neurologies. In that sense nature can be said to provide them with the luxury of not having to use cognition or make decisions. A second way is to assume that our concept of cognition is anthropocentrically derived and a somewhat reductionist take on the nature of mind. Unfortunately, neither approach resolves very much.

The instinct argument has flaws, as was pointed out by Maslow (1954). To assume organisms are fortuitously endowed with an ability to act in adaptive ways with little or no neural mechanism by which to do so, one first has to ask about the sources of these fortuitous abilities. Did nature say “presto” – you now have an instinct that just happens to coincide with the demands of the outside world? Unlikely. Even if one invokes the Darwinian argument that various behaviors were honed by evolution over time, it does not explain the structural and functional origins of the behavior to begin with. While it is possible that some organs and behavior patterns evolved to serve purposes other than the ones for which they were eventually employed – for example the wings of insects and birds, which favored temperature regulation before being employed in flight – this still does not address the question of why structures evolved coincident with purpose, as opposed to being random and inconsequential. It also raises the question of how many behaviors had to be tried, winnowed out and selected before the (temporarily) right ones arrived through evolution. In fact that juxtaposition of the random with the purposeful is what makes natural selection seem a bit problematic (Russell 2007).

In his book How the Mind Works (1997) Steven Pinker addressed this problem by proposing a quasi-computer model of mind. He suggested that each layer of the nervous system has built-in modules, i.e. for language, perception etc., and that they arise not through random mutations but through a priori functional circuits requiring little or none of the trial and error process inherent in natural selection. He also suggested they blend smoothly with previously established brain sites both structurally and functionally – a hugely deterministic, if not fatalistic, concept for a Darwinian advocate.

Even if Pinker’s well-written thesis is correct, it is nonetheless difficult to explain decision making at the anencephalic and molecular levels.

It would seem there is another way to resolve this issue. It is by assuming the existence of a transcendent law or mechanism in nature that requires no brain yet provides for and encompasses phenomena such as memory, stimulus recognition, decision making and cognition. The obvious question is where to look for such a cognitive monolaw.

The Roots of Cognition…

How can decisions be rendered, circumstances recognized, and prior experiences referenced or recalled without a neural mechanism by which to conduct these operations? One way to address that question is by analyzing the most basic equipment available to the first organisms. A basic requirement of a life form is the capacity to replicate via the transactions between RNA and DNA that align genes in the proper sequence. Another requirement is the alignment of amino acids, coupled with the release of water for binding during this process, to build the proteins leading to the construction of cells and organs. Life also requires mechanisms to react to the outside world for energy replenishment and to facilitate approach and avoidance responses. Yet there is another, often overlooked component – a capacity to undergo flux while maintaining internal stability; in other words a homeostatic, self-correcting (cybernetic) mechanism. If not for the latter, life could have come and gone many times over without lasting long enough to undergo changes in complexity.

Resilience…

In many ways the latter typifies the essence of life more than any other quality, because longevity is the key to development and fitness. While replication is often cited as the sine qua non of life, and while protein synthesis is considered similarly important, crystals and other non-biotic elements divide under certain chemical conditions in a process that runs parallel to replication (Ferro, 2010). Moreover, as Shapiro (1986) has pointed out, amino acids can come together to form proteins yet not in themselves comprise anything resembling what we might call “life.” Any number of pre-organic entities might have formed in the primordial soup, some providing organic essentials. Yet many undoubtedly broke apart from the onslaught of temperature shifts between day and night, oceanic tumult and/or the fragility of the compound itself. In that sense, for life to ensue required a holding together of a proto-organic entity so that its mechanical resilience would enable the prototype to expand and become more systemically complex (for example prokaryotic cells incorporating other components to evolve into nucleated eukaryotic cells). Thus if one seeks the foundation of biological existence it might be described in one word…”glue.”

The Origin of Stasis

A most essential glue factor in the advent and continuation of life forms was, of course, carbon – an element that is not especially plentiful in the earth’s atmosphere but one that makes up for its limited availability with extreme resilience. That resilience is most easily seen in the diamond, the hardest natural substance on earth, which consists exclusively of carbon atoms. Yet carbon is not just firm and resilient; it is also highly chemically accommodating… or “social” if you will. It is capable of bonding with a wide variety of other molecules and provided a perfect base for not only the origin but the continuance of early life forms. Because it has these communal properties it can be said to be the most communicative of the elements – a central grammar in the expression of biochemical synthesis.

Yet if carbon provided the mechanical and attractive necessities for life to form, what then provided the functional properties, i.e. the reactive essentials that enabled living things to multiply, heal, move, and ultimately make decisions? To an extent the answer might lie in yet another single term…electricity. Nothing in nature, including carbon, can interact with (attract or repel) other components without the impetus of an electrical charge. Positively charged atoms repel other positively charged atoms, while they are attracted to negatively charged ones. (The reason probably has to do with the inherent symmetry of the universe, but that is beyond the scope of this article.) In any event, on some level, stability, resilience, organic communication and genetic alignment – all those things that comprise life – are ultimately rooted in attractions and repulsions. The centralizing feature of carbon is itself rooted in that process. Therefore, in seeking the roots of pan-organic cognition, this might be a good place to start.

Where have you gone, Albert Einstein?…

Watson and Crick, discoverers of the structure of the DNA molecule, can be compared to Isaac Newton with respect to the latter’s contribution to an understanding of gravity. Newton used precise mathematical terms to determine the “what” of gravity. However it was not until Einstein came along and developed his theories of relativity that we came to know the “why.” As of now there is no “why” (or how) when it comes to an understanding of how life began, and more particularly of how organisms (indeed molecules such as messenger RNA) are able to make decisions. Some discussion of electricity might be helpful in that respect.

The Roots of Communication…

Communication can be discussed in other than anthropocentric terms without compromising the nature of human communication. In other words, there can be said to be a substrate or monolaw of all communication in nature that encompasses, and sets the stage for, human language. In its essence communication is really nothing more than an interaction of two components with a systematic (repeatable, predictable) behavioral outcome, i.e. a nonrandom interaction. When a positively charged atom interacts with a negatively charged atom, a galvanization occurs. That is a form of communication.

There are several types of communication. For convenience’s sake, three can be discussed. One type is unilateral – like a breeze pushing an old newspaper along the street. In this instance the newspaper does not have much impact on the breeze, other than perhaps some minor degree of interference from friction. Another type of communication is reciprocal, and is exemplified by a protozoan moving back and forth in response to temperature and light differentials.

Communication can also be multi-factorial, whereby complex channels within a system interact with one another to produce fluctuations without loss of systemic integrity. In each instance the level of decision making is different. In the unilateral model (the breeze and the newspaper) the decision is so basic as to be inconsequential: the paper drifts off and does not turn back on the breeze to cause a readjustment. In a two-way, reciprocal model the system is modifiable and can be defined as proto-cognitive because the interaction does not end with the first stimulus. In a sense the advent of feedback produces a communicative imperative – seen in the to-and-fro reactions of the protozoan. At that level, and as a result of feedback, decisions can be said to exist on every organic level.
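The contrast between the unilateral and reciprocal models can be sketched in a few lines of code. This is purely an illustration of the distinction drawn above, not anything from the article itself; the "temperature gradient" and the comfort point are hypothetical stand-ins.

```python
# Illustrative sketch: unilateral "communication" (no feedback)
# versus reciprocal, feedback-driven behavior.

def unilateral(position, breeze=1.0, steps=10):
    """Newspaper in a breeze: the input acts on the object,
    but the object never acts back. The state simply drifts."""
    for _ in range(steps):
        position += breeze
    return position

def reciprocal(position, comfort=20.0, steps=10):
    """Protozoan-style feedback: the organism senses a gradient
    (here, temperature is assumed to equal position) and each step
    moves to reduce the mismatch with its comfort point."""
    for _ in range(steps):
        temperature = position          # hypothetical gradient
        error = temperature - comfort   # sensed mismatch
        position -= 0.5 * error         # corrective move (feedback)
    return position

drifted = unilateral(0.0)    # drifts away from its start, never correcting
settled = reciprocal(0.0)    # converges toward the comfort point
print(drifted, round(settled, 3))
```

The reciprocal loop is "proto-cognitive" in the article's sense only because its output re-enters as input; remove the `error` term and it degenerates into the unilateral case.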

One does not need a brain or even a primordial nerve bundle to exercise cognitive decision making. One needs only electrically fostered, closed, reciprocal interactions. In fact, as Cenik, Cenik et al. (2015) have demonstrated with respect to the origin of life, the mystical, heretofore unexplained ability of RNA to “send messages” to DNA in creating protein structures in the body might be the result of a simple electrochemical reciprocity within a closed system (actually semi-closed is a more apt description) where feedback forces mechanisms to interact in orderly, systemic (and redundant) fashion. The charge originates from a nucleic acid compound which conveniently contains a highly combustible sugar and energy catalyst known as ribose. A message is conveyed because the system is contained, rather than consisting of unilateral inputs drifting off into the ether. In effect, reciprocity is trapped within the membranes. Boundaries enable feedback to occur. A similar process applies to the famous Miller-Urey experiments, in which charges impinging on chemical compounds helped produce amino acids (1959).

Membranes are crucial. The availability of lipids in the earth’s earliest environment was necessary since they can form a sheath that both insulates cells and organisms from the outside milieu and enables nutrients to pass through its border, so that organisms can extract energy from the sun and other sources. Semi-permeability is a crucial necessity for life sustenance.

From the Simple to the Complex…

In terms of the above discussion, the most essential difference between cognition at the most basic level and cognition within the realm of human thought might lie in the difference between simple and complex feedback systems.

Complex Cognition…

In some ways complexity is how life overcomes the law of entropy, which holds that all systems in nature will proceed toward decay. The more diverse yet internally regulated the variables in any system, the less likely it is that decay will ensue. Since we all die, life does eventually proceed toward decay, but only after staging an enduring battle against entropy in the face of constant changes in metabolism, maturation, injury and general wear and tear. Organisms endure and can delay entropy due to what W. Ross Ashby called a cybernetic process. Here he was referring to a self-correcting system in which higher order rules govern the ebb and flow of internal variations. This applies to brain and soma alike, because while the mind recognizes dangers – for example by fleeing from or avoiding unpleasant experiences – so do the immune system and all other bodily structures and functions. Indeed one could argue that a cognitive process exists on every level of organicity and in its essence has little to do with brain or mind.

Ultra-stability…

In that regard, a related concept proposed by Ashby was that of ultra-stability. This involves a system in which local events can fluctuate but, being tethered to a larger system, ultimately re-synthesize in adherence to the rules of that larger system. As abstract as this might seem, the game of chess offers a comparison. Each move by an opponent leads to a counter move. One player might see a direct line between his opponent’s rook and his queen and choose to move the queen to evade capture without exposing the king. However, in contemplating the move the player notices that his knight is within striking range of the opponent’s rook, and that taking out the rook would save both queen and king. During all this the main “code” of the game (capture the opponent’s king and protect your own) remains the same, but within that rubric a number of sub-plans are orchestrated.
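Ashby's ultra-stability idea can also be illustrated computationally. The sketch below is my own toy rendering, not Ashby's homeostat: an "essential variable" is held near a setpoint by an ordinary feedback loop, and whenever the current parameter setting fails (the variable leaves its safe bounds), the system makes a step change to a new trial parameter, exactly in the spirit of local fluctuation governed by a higher-order rule. All names and numbers here are invented for illustration.

```python
import random

def ultrastable_run(gain, setpoint=0.0, bounds=5.0, steps=200, seed=1):
    """Toy Ashby-style ultra-stable system: a local feedback loop keeps
    an essential variable near a setpoint; if the current gain fails and
    the variable leaves its safe bounds, the system reconfigures by
    drawing a new trial gain and starting over."""
    rng = random.Random(seed)
    x = 4.0          # essential variable, starting off-center
    resets = 0
    for _ in range(steps):
        x += -gain * (x - setpoint) + rng.uniform(-0.5, 0.5)  # feedback + noise
        if abs(x - setpoint) > bounds:      # out of the safe range:
            gain = rng.uniform(0.1, 1.0)    # step-change to a new parameter
            x = 4.0
            resets += 1
    return x, resets

# A destabilizing parameter (negative gain amplifies error) forces the
# system to reconfigure until it finds a stabilizing one.
final_x, resets = ultrastable_run(gain=-0.5)
print(round(final_x, 2), resets)
```

The higher-order rule (keep the essential variable in bounds) never changes; only the local parameter does, which is the chess analogy in miniature: sub-plans are swapped while the governing code of the game stays fixed.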

There is a difference between cognition per se and a game of chess, but there are also similarities. The similarity owes to the fact that both operate through multiple sub-plans and actions revolving around a central theme. The point of playing chess is to win. The point of garden variety cognition is neural stability. In that context, cognition is a dual process serving two functions; one being neurobiological ultra-stability, the other involving the experiential correlates of that closure.

Arousal, Quiescence and the Sheffield Study…

The above reference to ultra-stability should not be confused with the Freudian notion of tension reduction. With regard to the restoration of stability – a mechanism seen in all organisms – it is not a question of arousal or tension being reduced by either neural or experiential closure. It simply refers to the restoration of a steady state which, as conveyed in the classic study by Sheffield (1966), could take the form of tension reduction or tension induction.

This view of cognition can theoretically be applied to a wide variety of decisions rendered in nature, and perhaps explain phenomena like messenger RNA and Dawkins’ notion of the “selfish gene” (1976). While in many ways this might make human thought seem less unique, perhaps even discouraging to the anthropocentrist, it provides a possible clue as to how nature initially paved the way for human intelligence. While speculative, it does offer a possible connection between the hunting amoeba, the maternal behaviors of the cobra, and Newton’s calculations on gravity.

REFERENCES

Cenik, K., Cenik, E.S., Byeon, G.W., Grubert, F., Candille, S.I., Spacek, D., Alsallakh, B., Tilgner, H., Araya, C.L., Tang, H., Ricci, E., Snyder, M.P. (2015) Integrative Analysis of RNA Translation and Protein Levels Reveals Distinct Regulatory Variation Across Humans. Genome Research 25; 1610-21

Cumming, P., Hall, J.R., Junhwan, J., Weaver, A., Quaranta, V. (2010) Human Cells Exhibit Foraging Behavior Like Amoebae and Bacteria. Public Library of Science Journal

Dawkins, R. (1976) The Selfish Gene. Best Books

Ferro, S. (Article from Science retrieved Feb.1, 2013) Physicists Create Crystals That Are Nearly Alive

Maslow, A. H. (1954) Instinct Theory Re-examined. Motivation and Personality. New York, Harper & Row

Miller, S.L. Urey, H. (1959) Organic Compound Synthesis on the Primitive Earth. Science, 130 (3370) 528-529

Pinker, S. (1997) How the Mind Works. Norton

Rowe, A.D., Bullock, P.R., Polkey, C.E., Morris, R.G. (2001) Theory of Mind: Impairments and their Relationship to Executive Functioning Following Frontal Lobe Excision. Brain 124 (pt. 3) 600-616

Russell, J. (2007) A Challenge to Richard Dawkins. Science Index, Vedic Science

Shapiro, R. (1986) Origins: A Skeptic’s Guide to the Creation of Life on Earth. Summit Books.

Sheffield, F.D. (1966) A Drive Induction Theory of Reinforcement. In R.N. Haber (Ed.) Current Research and Theory in Motivation. NY Holt. Rinehart & Winston

Suzuki, W. Yanike, N. Wirth, S. (Article Retrieved May 13, 2004 from Science Daily). Scientists Show Hippocampus’ Role in Long Term Memory.

Timmer, J. (Article Retrieved Oct.8, 2012 in Science and Organism Exploration) Organism without a Brain Creates External Memories for Navigation.


Essay on Morality: The Resurrection of Sigmund Freud

June 11th, 2015 by Robert DePaolo | Posted in Psychology

By Robert DePaolo

Abstract

This article discusses the history and essential aspects of human morality, the declining adherence to classical religious doctrines and the need for a future set of moral principles to provide guidance and restraints on human behavior.

A Moral Framework…

Describing just what is meant by morality can be a difficult task; not just due to the variations among cultures both now and over the course of time, but also because morals, despite being oft expressed in concrete written formats, i.e. commandments, constitutions and statutes, are inherently fluid. The fluidity is largely due to the human tendency toward proportional thinking. Just as the neuronal configurations in the human brain can differentiate between “bigger” and “smaller”, and between the formula for the radius vs. the diameter of a circle, so is the brain prone to moral calculations. A man who uses foul language is given a lighter penance than one who commits adultery. A man who kills with deliberation is punished more severely than one who kills on impulse. Indeed, one who kills in self-defense will typically receive no punishment at all. That of course makes the circumstances of the act as relevant morally as the act itself, and that in turn makes morality as much a function of cognitive deliberation and examination as one of hard and fast rules.

There is nothing unusual about that. The human mind is disposed to think, perceive and emote in contextual ways, which is why Thomas Aquinas approached faith and morality in logical and philosophical as well as religious terms (Torrell 2005). In founding the philo-religious school of thought known as Thomism, Aquinas integrated Aristotelian logic with traditional Christian principles.

There are of course a variety of moral theories, most with workable premises that have guided us through history, succeeding (more or less) at keeping the worst of human behavior under wraps. One was espoused by our third president.

A Second Declaration…

In his letters to Doctor Benjamin Rush in 1803, Thomas Jefferson sought to flesh out the most essential aspects of morality by discussing it in non-religious terms. (He was a great admirer of Jesus’s teachings, less so of the accounts of miracles and the virgin birth.) To him basic morality entailed not strict adherence to rules but a capacity to feel the plight of the one being harmed or victimized. Jefferson felt anyone who could extend his thought process into the vicarious far reaches of mind could be described as moral – even if an atheist by conviction. In simpler terms, he felt that the core requirement of morality is empathy (Sanford 1987).

Jefferson’s moral concept is certainly feasible. Most researchers and clinicians in the fields of psychiatry and psychology equate morality with a capacity to feel for the other person, and the assumption in this line of thinking is that a capacity to extend beyond the self – both intellectually and emotionally – would lead to an adequate degree of self-restraint and a prosocial outlook. However, Jefferson’s view on morality was not original.

Siddhartha Gautama was also an advocate of “empathy theory.” He actually broadened the empathic criteria and parameters of moral reciprocity to include all of nature. Jesus of Nazareth was another significant advocate of the empathic model of morality, to a point where his tenets (which were arguably intended to re-invigorate classical Judaic ideas purveyed by Isaiah and the earlier prophets rather than establish a new religion) offered an inclusive contrast to the chauvinistic, cold pragmatism of Roman influence. Many of his sermons address the plight of the other person; for example in his decision to forgive and redirect rather than condemn the prostitute, and in his capacity to feel for the Roman soldier by healing his slave.

Empathy theory remains on solid footing in modern times. It has been demonstrated that antisocial behavior patterns are more likely to occur in individuals who lack this particular capacity (deWied, Gaudena et al. 2005; Decety, Skelly et al. 2013; Baron-Cohen 2011). However, it seems less than a complete explanation of essential morality. For example, people can have moral behavior patterns instilled in them by ritual, by social pressures and by the threat of punishment, despite a lack of empathy. In such instances morality can be viewed not as an altruistic response but as an operant behavior designed to avoid punishment and maintain ritualistic dependency. Thus while empathy is part of the moral picture, there might be more involved.

Biomorality…

Another moral theory has its roots in human evolution; specifically, the tendency derived from our primate ancestry toward adhering to social hierarchies and seeking dominance (Boehm, 1982). The fact that many human endeavors have a competitive tinge (sports, politics, the quest for social rank etc.) would seem to suggest humans do indeed tend to act in accord with a kind of alpha politics. Moreover, aspirations for dominance can be said to run parallel to immoral behavior, in that dominance equates with aggression and leads to a diminution in the value of victims. That makes them seem less human, thus easier to humiliate, control, abuse or even murder. Adolf Hitler’s attitude and actions regarding, on one hand, the ostensible superiority of the Aryan race and, on the other, the inferiority of Jews most certainly provided grist for the mill in his heinous, dominance-fueled rampage through history.

Thus far the elements of empathy, social rank and proportion have been discussed. While they seem to be separate components, they can be melded together as a unified concept. That is because, much of what we call proportionate thinking involves a cognitive process whereby one assumes the position of the actor and/or his victim…e.g. “if it were me.” Yet even that synthesis leaves room for added factors.

Traditional Models…

As any student of ethics can attest, a number of moral theories have been proffered over time. Philosopher Immanuel Kant conjured up the idea of a categorical imperative. This concept featured a “golden rule” topography, assuming that morality involves a sense of duty, the use of reasoning to discern good from evil, and reliance on consensus to determine moral parameters. Kant believed morality must be universally accepted to be deemed “moral” (Ellington 1993). While some have called this into question, it does have merit; for example most societies have formed consensual prohibitions against murder, theft and adultery.

The moral relativists would tend to disagree with Kant, however, in espousing that morals can only be defined by the times. So too might advocates of Utilitarian theory, which holds that ostensibly immoral acts can be deemed moral if they ultimately serve a higher purpose – a concept quite similar to Thoreau’s notion of civil disobedience and interestingly enough to Machiavelli’s end-means paradigm. Meanwhile the ancient Greeks, most notably Aristotle, conjured up a virtue-based model of morality, which revolved around the idea that true moral behavior was actually pleasurable, i.e. dually reinforcing. The idea was that being virtuous would lead to a state of contentment. This was an interesting combination of hedonism and values that waned in the wake of sterner Judeo-Christian principles arising in the Middle Ages but it too has a degree of merit. It does feel good to behave virtuously; for example emotional gratification often results from helping others.

Still another moral theory takes the form of ethical egoism. This bears similarities to the moral-hedonic paradigm of the Greeks, since it holds that true morality consists of a combination of self and other gratification, i.e. “I do for you and you do for me.” It is a doctrine of reciprocity which is reminiscent of Locke’s idea of a social contract and with the underpinnings of democratic government.

Of course, there is also the Divine Command theory of morality, which, as the name suggests, emanates from the words and laws bestowed on man by God. In its purest form this system deems humility the highest virtue and pride the primary vice. It is obedience-based, as seen in the ordering of the Ten Commandments, which begin with the mandate of worshipping one and only one god. This system has sustained human morality for millennia and, despite critics, has led to far more benevolent cultural and moral outcomes than negative ones. The problem with this system is that society evolves and God is not readily available to address those changes in terms of his original commands. For example, the idea of going forth to multiply was necessary in helping the chosen people build a formidable society, army and culture. During the Genesis epoch, overpopulation (which could intensify competition and deplete resource allotments) wasn't a problem. Thus, absent deistic intervention, it is difficult to apply certain ancient principles to modern times except by the reasoning of man. That reasoning can be fallible, despite the existence of church leaders.

Synthesis…

A key to finding moral essentials is to look for overriding principles that encompass all these theoretical concepts. One place to start is with a champion of morality, albeit a somewhat unexpected one, who goes by the name of Sigmund Freud. The question becomes: how can the writings of Freud help in the development of a moral code that is consistent with (but does not rely on) religious content, and that incorporates all the traits of mankind, including his primal, cognitive, social and emotional proclivities? The reason for engaging in this unified-moral-theory exercise is practical. Given the decline in religious affiliations (less so in the USA than in Europe), along with the competing influence of empirical philosophy, mankind will still need a moral format in order to continue to exist, regardless of the source of that model. Given the prevalence of war, enslavement, deception and inter-tribal hostility over the course of history, we simply need a regulatory process to rein in our worst instincts. While Homo sapiens has proved capable of inspiring acts of altruism as well, the destruction resulting from our bad side arguably outweighs the good emanating from our collective acts of kindness. For example, no act of kindness or altruistic endeavor has had the impact of Hitler's, Stalin's or Chairman Mao's purges.

Morality Meets Human Nature…

Among so-called moral theorists, only Freud acknowledged the existence of the "id". He did so not as a negative component of mind but as a necessary and functional energy-enhancing mechanism that provides us with the fuel for motivation and creativity. In so doing Freud captured both the good and the bad in Homo sapiens with his concept of an integrative psyche, whereby the functional juxtaposition of psychic faculties, rather than the mere prohibition of immoral impulses, comprises the essence of a pragmatic morality. That integrative process would enable us to convert potential immorality into a creative process. In other words, bad impulses can be viewed simply as raw materials for either social destruction or social enhancement (Freud 1923).

Sociopathy…

Immorality is typically and understandably equated with an anti-social mindset. Lack of empathy, social alienation, frustration-aggression sequences, narcissism – these are all catchwords associated with immoral behavior patterns. The underlying assumption behind each is that the immoral person lacks an emotional connection to his fellow man, that alienation from society enables him to detach socially to such an extent as to objectify others and view them as simply a means of addressing his own needs. It is a reasonable assumption, except that it does not get at the root of what could be termed “moral pathology.” To find that root it might help to begin some discussion of Freud’s ultimate moral regulator – the ego.

A Moral Gestalt…

To understand moral pathology requires some understanding of ego function. While the colloquial concept of "ego" implies single-minded conceit, its actual purpose in clinical terms is the opposite of that. The ego is the psychic voice of reason. It assimilates impulses – knowing full well (if I can personify a bit) that the animal side of Homo sapiens exists and, like energy itself, can never be obliterated, only converted to productive pursuits. So the ego looks in on an impulse, weighs its impact, looks out at the mores of society and the schemata of the individual, and decides how, when or whether to express or channel the id impulse. Similarly, it takes another glance, this time at the superego – the psychic faculty that is equally impulsive but uses its single-mindedness to inhibit behavior and sanction thoughts and feelings. The ego assimilates both to facilitate appropriate expression. In so doing it prevents "psychic implosion" (used here as an alternative term to Freud's neurosis).

Absent an ego, the human psyche would be forced to alternate between self-serving behaviors and extreme blockage. Madness would result in each of us if not for its modulations.

What, then, does the ego actually do that allows one to think of it as a regulator of morality? To answer that question it is necessary to define what is implied here about the essential nature of psychopathology.

For purposes of discussion it might help to define it in simple terms, to wit: pathology equates with compulsion. Any person who cannot see both forest and trees in a situation involving a moral issue – or perhaps any issue – will tend toward immoral behavior. Stalin was single-minded in his persecution of Soviet dissidents. Hitler was obsessively consumed with hate for Jews, gypsies, homosexuals and anyone who did not fit into his single-minded (anthropologically absurd) notion of an "Aryan." All dictators and all psychopaths are absorbed with their compulsions, and are either incapable of or indisposed to integrative thinking.

Non-integrative, obsessive thinking can therefore be viewed as a psychic virus that infects the mind of the psychopath. It might not always be manifest as mass murder or financial swindling but it must be a pervasive cognitive/emotional factor to produce immoral tendencies.

Educators like Vygotsky refer to a similar process in discussing a learning prerequisite known as metacognition (Braten 1991). This is simply a more intellectual version of the same figure/ground mechanism, in which more than one element is processed at a time and a big picture emerges to create what modern educators call "contextual learning." In a sense, ego and metacognition ostensibly derive from one and the same process, and both allow us to conceptualize moral and cognitive learning under one rubric (Kohlberg 1981).

This model has both social and neurological relevance. For example, a high percentage of incarcerated individuals have been diagnosed with attention or learning problems and feature impulsive behavior patterns. That trend can result from neural arousal spikes (typical of individuals with ADHD) that create such immediate emotional thrust as to preclude the kind of integrative cognition needed to enact the ego mechanism. In that context the age-old question of whether hardened, ego-deficient criminals can be truly rehabilitated comes into play. More neurologically: does their brain activation occur so rapidly as to preclude access to competing thoughts, making neuro-chemical and physiological remediation as important to rehabilitation as time served and job training? Unfortunately the jury is still out on that. However, that doesn't prevent society and its teaching agents (parents, educators, clerics) from utilizing measures to prevent the development of sociopathy.

Before discussing prevention, however, there is another practical question to be addressed. One-track-minded people often create chaos, but their drive is often instrumental in cultural progress. Thus the obsession with conquest by Alexander and the Romans led to advancements in both western and eastern cultures. The drive to dominate the oil industry made John D. Rockefeller appear crass and greedy but also produced enormous advancements in the industrialization of America. What do we make of such extreme but apparently necessary mindsets in moral terms?

One answer can be found in Freud's notion of id-channeling. Drive in itself is not immoral. The proportion at which it inflicts either harm or benefit on others does have moral implications, and that bears on how others view the driven individual. How shall we react to the movers and shakers, those who choose to buy not one but two yachts, or ignore family in pursuit of personal goals and dreams? Do we use our own ego faculties to see both the figure and ground of such personalities, accepting their vices because of the advantages their energies ultimately provide? Do we adhere to tenets that might ostensibly be held by both Freud and Jesus – that in some instances tolerance and forgiveness are necessary components of morality?

A Functional Morality…

If Freud's model of mind can be utilized as a kind of moral teaching tool, the question arises as to how it could be used. First, a bit of relief for the faithful: as a model it would not contradict religious teaching. As discussed above, the teachings of Jesus, Buddha, Vishnu and Yahweh were arguably integrative. Referencing them, in concert with the imparting of concrete figure/ground cognitive/emotional moral lessons, would be perfectly workable. Indeed most of the other moral theories could be assimilated into this paradigm as well. The weighing of all factors, the coupling of self and other in each deliberation, the analysis of past, present and future in decision making – all aspects of inclusive deliberation could be incorporated into this paradigm. In effect, such a didactic approach could also be used to rein in the frustration- and competition-fueled anti-social patterns in a free society like the USA. It would merely involve telling youth: You do not live in a free society. You live in a contractual society. You provide for government, and it in turn provides for you. You help your neighbor and he will in turn help you. You are a biological individual but also part of a social system, part of a gestalt. That is the foundation of the state and of the basic moral code among people. Since the word "freedom" is an abstraction in any case, and not terribly functional in prescribing moral attitudes or behaviors, it just might work. The skeptic will no doubt ask: Are you imposing the use of psychotherapy on global society, or perhaps indoctrinating youth into a disputed psychoanalytic model that even many modern clinicians reject? The criticism would be valid and, to be perfectly honest, while such a system (teaching ego development) might be effective, it is hard to say.

References

Baron-Cohen, S. (2011) Zero Degrees of Empathy: A New Theory of Human Cruelty. Penguin. U.K.

Boehm, C. (1982) The Evolutionary Development of Morality as an Effect of Dominance Behaviors and Conflict Interference. Journal of Social and Biological Structures 5: 413-422

Braten, I. (1991) Vygotsky as Precursor to Metacognitive Theory: II Vygotsky as Metacognitivist. Scandinavian Journal of Educational Research. Vol. 35, No. 4 pp. 305-320

Decety, J., Skelly, L.R. & Kiehl, K.A. (2013) Brain Response to Empathy-Eliciting Scenarios in Incarcerated Individuals with Psychopathy. JAMA Psychiatry 70 (6): 638-645

DeWied, M. Gaudena, P. Matthys, W. (2005) Empathy in Boys with Disruptive Behavior Disorders. Journal of Child Psychology, Psychiatry and Allied Disciplines 46 (8) 867-880

Freud, S. (1923) The Ego and the Id. Translation by Joan Riviere. Hogarth Press and the Institute of Psychoanalysis.

Jefferson, T. (1803) Letter to Doctor Benjamin Rush. In A. E. Bergh (Ed.), The Writings of Thomas Jefferson.

Kohlberg, L (1981) Essays on Moral Development. Vol. 1 The Philosophy of Moral Development. San Francisco, CA Harper & Row.

Sanford, C (1987) The Religious Life of Thomas Jefferson. Charlotte, NC Press

Torrell, J.-P. (2005) Saint Thomas Aquinas: The Person and His Work. CUA Press


Essay: Psychiatry and Human Experience

March 12th, 2015 by Robert DePaolo | Posted in Psychology

By Robert DePaolo

Abstract

This article discusses a modification of Freud's tripartite theory of mind, in the form of a dualistic model in which only two essential tasks face the psyche: the search for and accommodation of periodic conflict/uncertainty, and the periodic resolution of that uncertainty, i.e. conflict reduction. This model is described in the context of human evolution.

Freudian Functionalism…

Sigmund Freud's structural model of mind (its topography as opposed to psychoanalysis per se) continues to exert an influence on clinical practice (Dvorsky 2013). While the clinical zeitgeist has shifted in recent times toward cognitive and behavioral methodologies, the basic analytic premise of three aspects of mind either competing or meshing to produce both mental health and mental breakdown seems still relevant. Like a sturdy, massive tree trunk giving rise to various branch networks, the notion of a reality-modulating ego, a conscience-driven superego and an energy-fomenting, primal id is arguably so entrenched in even modern concepts of personality as to constitute a virtual tautology.

To an extent the triadic model of mind has been supported by neurological studies (Miller & Katz 1989), some of which describe the human brain as part primal (the id function residing in the limbic system), part social-perceptual (the ego function residing in the cortical regions) and part ethical, moral and self-conscious (the superego function residing in the prefrontal lobes).

Despite the apparent soundness of this model, there is another way to look at the personality, one consistent with both everyday (and perhaps even unconscious) experience and the neural functions of the human brain. It is a much simpler model, consisting of only two components, and it derives from the brain's evolutionary development.

Advent of a noise-busting machine…

In the course of evolution (evidently around 200,000 years ago) it appears the human brain rapidly added substantial neural mass (Fu, Giavalisco et al. 2011). It is possible, given the evidence of our remote ancestors' cultural evolution, that there was not a strict correlation between brain mass and cultural advancement – certainly not on the same progressive timetables seen today. In effect, it seems the size of the human brain initially outstripped functional necessity.

Even today, it is arguable that our brains are too massive. This is more than a philosophical point. It is well known that after birth the brain sheds significant amounts of tissue at various stages of development, a process typically referred to as pruning. Interestingly, the reduction in brain cells actually leads to more sophisticated cognitive abilities. The reasons are two-fold. First, in child development neural circuits develop vastly increased interconnections. Vertical neural hookups (established in early development) come first, followed later by cross-grid connections. The latter correlate with language development, so that categorical thought sets the stage for integrative thinking. That enables the brain to think economically and holistically. With the onset of what Piaget called operational cognition, a child no longer has to store separate memories in pathways devoted, for example, to "apple," "orange" and "banana." He can now retrieve the memories of each by referencing the concept of "fruit." Such linguistic bridge building is typically followed by pruning, both because having multiple access routes to memories precludes the need for sheer neural volume and because retaining such volume would create noise interference with regard to memory retrieval.
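The economy attributed here to concept formation can be caricatured in code. This is a toy analogy only, not a model of neural storage; the names and data below are invented for illustration:

```python
# Toy illustration (not a neural model): retrieving several memories
# through one category node instead of one dedicated route per item.

memories = {"apple": "red, sweet", "orange": "citrus", "banana": "yellow"}

# Before concept formation: one access route per item.
routes_before = len(memories)  # 3 separate routes

# After forming the concept "fruit": one route reaches every member.
concepts = {"fruit": ["apple", "orange", "banana"]}

def recall_via_concept(concept):
    """Return every memory reachable through a single category node."""
    return {item: memories[item] for item in concepts[concept]}

print(recall_via_concept("fruit"))
# one concept node now substitutes for three separate access routes
```

The point of the caricature is only that a categorical "bridge" makes redundant access routes unnecessary, which is the sense in which concept formation could license pruning.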

Noise…

The above discussion is not intended to imply that as it passes through the several post-pruning stages the brain is absolved of noise interference. To the contrary, since the brain remains extremely large relative to body size, there is still too much volume to process experience in simple terms. Add to that Lashley's principle of mass action (Fancher & Rutherford 2012) and it becomes clear that with each experience so much brain activation occurs that a super-sifting mechanism becomes necessary to find our thoughts and summon our best behavior. That somewhat deliberate, sifting cognitive style is what makes humans so…er…deliberative. We can (indeed must) pause, delay, contemplate, appraise and engage in many of the secondary thought-appraisal mechanisms described by Lazarus (1984) as a direct function of brain volume and interconnectivity.

That means human experience is characterized by fairly constant noise. Our percepts are not as clear and concrete as those of smaller-brained creatures. In fact, it is possible that what we call instinct (and often demean as beneath the parameters of human experience) is actually the neural norm; in the case of humans, not absent but rather eclipsed and camouflaged by competing, interfering inputs that rework instincts into more complex, multiply influenced cognitions and behaviors. In that instance it is conceivable – though highly speculative – to think of learning as being merely the modification and enhancement of instinct. If that rings true, then it is possible to think of the psyche not as a triadic mechanism but as a dual process caught inexorably on a continuum between noise and resolution.

The evolution of the psyche…

The evolution of all organic systems carries with it an adaptive mandate. While many anatomical mutations are probably inconsequential with regard to survival, such changes are tested in one way or another by nature. In the case of the human brain that was most certainly true. After all, the brain regulates cognition, emotion, vegetative functions and movements, all of which must be adroitly orchestrated for us to function efficiently. When brain mass reaches a certain point and noise interference creates a bottleneck on experience the human psyche must adapt to that. In some instances this occurs by cancellation, in others by compensation, in still others by using it to our advantage.

One way to accommodate a massive brain that modifies, blurs and embellishes experience, while at the same time being able to perceive and react to the outside world with enough efficiency to survive, is to develop a bi-modal mind: one part accommodates systemic noise by seeking out conflict, inconsistency, negativity and other irresolute aspects of the inner and outer environments; the other resolves those uncertainties.

For this type of mind to operate would require equal capacity and proneness to seek questions and answers alternately via an ongoing, nearly constant flow of experience. To do otherwise would render us dysfunctional. For example, if the human brain were wired/skewed solely toward seeking resolution, noise would lead to an overwhelming state of unresolved arousal – perhaps leading to what Pavlov called protective inhibition. It would be forced to shut down. By the same token, a brain attracted only to noise or uncertainty would become mired in confusion, unable to select efficient behaviors and thoughts. Thus the sheer size of the human brain might in a sense render it too vast for common experience. Because it would be internally as well as externally driven, it would have to adapt to itself and the outside environment in an inverse Darwinian scenario. It would have to adopt an "on the balls of its feet" basal status, so that an obligatory search for noise could insulate it homeostatically against potential novelty-induced overload, while the alternate search for noise/conflict reduction could provide temporary relief when uncertainty presented itself.

In that context, in order for the psyche to function optimally it would have to be dually oriented toward a sequential conflict-induction, conflict-reduction process. The net psychological effect would be for the human mind to sense states of discomfort both when there is a dearth of uncertainty (a mental state of entropy) and also when conflict/ uncertainty reaches a high threshold. Like Heisenberg’s uncertainty principle in physics, whereby the motion and location of particles, and the shifting wave/particle duality seem both separate and integrated, the mind could be viewed as caught somewhere between resolution and noise, with no capacity for a permanent emotional state.

The Libido…

One interesting aspect of Freud’s triadic model of mind is that it entails an energy source. Since all bio-information systems require that, so must this bimodal model of mind. The question to ask is, where does the energy source come from? Freud viewed the energy source as emanating from the primal id. Yet, taken to its logical endpoint, that argument raises questions. For example why should it be that a primal aspect of mind “allots” energy to the rational components of mind? Does not the process of ego-fostered rational cognition require acetylcholine and norepinephrine to conduct impulses? Isn’t all brain-psychic activity dependent on sodium/potassium differentials across the neuron membranes? To say the primal component of the psyche is the source of energy, would seem to skew the process of mentalism in a way that does not coincide with basic brain physiology.

On the other hand, the dual model of mind can address that question in a bio-consonant way. If the source of energy is arousal – as it must be – and if it can be shown that arousal is some function of neural noise, i.e. a state of uncertainty, then the source of the libido will have been ostensibly found.

Studies on the correlation between perceptual and cognitive conflict and brain arousal are fairly well documented (Berlyne 1960; Jepma, Verdonschot et al. 2012), so the argument is at least plausible. Beyond that, the dual model is more consistent with the nature of energy in metabolic terms. For example, it is known that the human body operates by an alternating sequence of anabolic (energy build-up) and catabolic (energy usage or breakdown) mechanisms. Since mind and body are both physical systems, the dual-model mind would seem to be more in line with the body's metabolism.

The Ego…

With regard to the above contentions, what are we to say about the ego – the structure responsible for the advent and propagation of human culture itself?

One way to incorporate an ego function into duality theory is to view it not as a referee in a bout between impulse and conscience but as an arbiter doling out (and advocating for) just the right proportion between uncertainty and closure. In that context its main task – leading to mental health and overall social/intrapersonal adaptation – would be to ensure that the person is neither too assured nor too confused, neither too certain nor too uncertain. A healthy, functioning ego would foment energy in the psyche by first looking out at the real world. In hyper-certain, experientially stagnant times it would seek out some degree of conflict. Conversely, in hypo-certain times it would shift gears and seek resolutions. Its version of consciousness, i.e. a higher-order meta-cognitive state, would entail awareness of the fact that conflict and resolution are interdependent. As information theory so convincingly attests, there can be no information without a prior state of uncertainty.
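That information-theoretic claim can be made concrete with a short calculation of Shannon entropy. The sketch below is illustrative only; the function and probabilities are mine, not the article's:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the uncertainty present before an outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An outcome known in advance carries no prior uncertainty, so observing it
# yields no information; a 50/50 outcome carries one full bit.
print(entropy([1.0]))       # 0.0 -> no uncertainty, nothing to learn
print(entropy([0.5, 0.5]))  # 1.0 -> maximal uncertainty for two outcomes
```

A certain outcome yields zero bits, while maximal uncertainty between two outcomes yields one bit: information is literally defined as the reduction of prior uncertainty.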

By that line of reasoning, the logical endpoint of a mentally healthy life – to the extent that it can be defined beyond subjectivity – would be one typified by a sense of constant growth and a dual search for never-ending arrangements of challenges, conflicts and subsequent resolutions. Existentially this would take the form of trying new things, taking resolvable risks, never presuming there is chronic happiness or permanent closure, and instead acknowledging that all emotional states are temporary points on a continuum, as conjured up by the brain via its natural proclivities.

Parallels…

Replacing the triadic model with a dual model would require subjecting it to certain tests, one of which is whether it agrees with other well-accepted clinical ideas.

The idea of manageable conflict does coincide with Seligman's dual notions of learned helplessness and hopefulness (1972). Indeed, his idea of behavioral inoculation – exposing persons to challenges they can regularly overcome in order to build a resilient personality – runs along the same lines. Jung's dual concepts of growth and stagnation are similar in nature, as is Erikson's theory of personality, with its bimodal, conflicting stages of development (1959). Even the concrete ideas conveyed in Skinnerian behaviorism lead to similar conclusions. For example, Skinner demonstrated that variable schedules of reinforcement (particularly the variable-ratio schedule) tend to produce the most enduring response patterns. While his findings were outlined in mathematical terms, the organisms he conditioned were subject to periods of uncertainty, which apparently enhanced their sense of hopefulness, their response energies and their persistence during these experiments.
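The distinction Skinner drew can be sketched with a toy simulation. The function names and the mean ratio of 5 are my own illustrative choices, not Skinner's procedure: a variable-ratio schedule reinforces each response with a fixed probability, so the moment of payoff is unpredictable, while a fixed-ratio schedule pays off on a rigid count.

```python
import random

def variable_ratio(n_responses, mean_ratio=5, seed=0):
    """Variable-ratio schedule: each response is reinforced with
    probability 1/mean_ratio, so the payoff moment is unpredictable."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(n_responses)]

def fixed_ratio(n_responses, ratio=5):
    """Fixed-ratio schedule: every `ratio`-th response is reinforced."""
    return [(i + 1) % ratio == 0 for i in range(n_responses)]

vr = variable_ratio(100)
fr = fixed_ratio(100)
# Both deliver roughly the same number of reinforcers over 100 responses...
print(sum(vr), sum(fr))
# ...but only the fixed schedule is predictable response-to-response.
print(fr[:10])  # reinforcement lands exactly on responses 5 and 10
```

The behavioral claim, that the uncertain schedule sustains responding longer, is Skinner's empirical finding; the simulation only shows where the uncertainty lives.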

Clinical Implications…

Another test of the bi-modal psyche has to do with clinical application. For example, how would a clinician use this model in diagnosis and treatment? Like the Freudian model, which gave rise to ego therapy, rational therapy and cognitive-behavior therapies, this model could be used in a variety of ways. All, like the C-B and rational models, would probably require some degree of philosophy in the curative mix. Ideas about seeking out new ventures, inducing moderate conflict (what Selye (1983) referred to as eustress), avoiding stagnation and developing conflict-resolution skills would be grist for the therapeutic mill. Meanwhile, realizing that happiness is episodic rather than a psychological resting place would magnify its importance. The clinical philosophy would propose that noise will recur, and that averting depression, compulsivity and anxiety requires a preventive lifestyle, whereby the client not only addresses immediate stressors but is coached to pursue a lifestyle of reasonable risk and conflict tolerance, with a sense that experiences like growth, achievement and success are momentary lulls on a continuum that must and will shift back, simply because nature designed us that way.

REFERENCES

Dvorsky, G. (2013) Why Freud Still Matters, When He Was Wrong About Almost Everything. io9 Daily Explainer, 8/7/13

Erikson, E. H. (1959) Identity and the Life Cycle. New York. International Universities Press

Freud, S. (1933) New Introductory Lectures on Psychoanalysis. Penguin Press – Freudian Library

Fu, X., Giavalisco, P., Liu, X., Catchpole, G., Fu, N., Ning, Z.-B., Guo, S., Yan, Z., Somel, M., Pääbo, S., Zeng, R., Willmitzer, L. & Khaitovich, P. (2011) Rapid Metabolic Evolution in Human Prefrontal Cortex. Proceedings of the National Academy of Sciences 108 (15): 6181-6186

Jepma, M. Verdonschot, R.G. Fu, X. van Steenbergen, H. Rombouts, S.A. & Nieuwenhuis, S. (2012) Neural Mechanisms Underlying The Induction and Relief of Perceptual Curiosity. Frontiers in Behavioral Neuroscience. 6:5

Lazarus, R. & Folkman, S. (1984) Stress, Appraisal and Coping. New York, Springer Publishing Co.

Miller, N. & Katz, J. (1989) The Neurological Legacy of Psychoanalysis: Freud as a Neurologist. Comprehensive Psychiatry 30 (2): 128-134

Fancher, R. & Rutherford, A. (2012) Pioneers of Psychology: A History, 4th Edition. New York: W.W. Norton

Seligman, M.E.P. (1972) Learned Helplessness. Annual Review of Medicine 23 (1): 407-412

Selye, H. (1983) The Stress Concept: Past, Present and Future. In Cooper, C.L. (Ed.), Stress Research for the Eighties. New York: John Wiley & Sons, pp. 1-20


A PROTOCOL FOR SCHOOL PHOBIA/SOCIAL ANXIETY

November 25th, 2014 by Robert DePaolo | Posted in Psychology

By Robert DePaolo

Abstract

Methods for treating school phobia typically involve systematic desensitization and/or cognitive therapy. The former purports to undo (i.e. counter-condition) the association between anxiety reactions and the stimuli and/or circumstances that provoke them. The latter purports to change the structure of schemata and override anxiety by changing the quasi-logic responsible for provoking and sustaining the phobia. While both methods can be effective, the following treatment suggestions, which incorporate CBT, SD and assertiveness-training approaches, add another factor to the therapeutic mix – the element of self-talk regulation.

The types of social-emotional disorders seen in and outside of school settings seem increasingly related to students' incapacity for self-regulation (Gross 1998; Mennin 2004). This skill – referred to variously as metacognition, self-control, conscience and executive functioning – is quintessentially important in almost all aspects of the school experience. Once it is anchored down, students can more easily attend, memorize, modulate emotions and profit from peer interactions. Conversely, deficiencies in this area tend to produce a wide variety of negative outcomes.

Dealing with the problem in schools would be easier if one could define in concise terms what self-regulation really means. In psychological terms this is a somewhat Byzantine endeavor – witness the various characterizations mentioned above. In neuro-psychological terms it is a bit easier to do. It is known that the frontal lobes of the brain – which unfortunately for schools and society in general do not fully mature until around age 25 – provide the self-regulatory function. But how is this accomplished?

The frontal lobes are curious structures because they are not devoted to any specific sensory or motor function. In fact they are a fairly recent evolutionary byproduct of brain expansion, branching off the parietal lobe, which gives us language and fine motor control (including the orchestration of mouth, tongue, fingers and hands that is coincidentally responsible for the advent and expansion of human culture). As the parietal lobe extends forward into the frontal area it is met by vast inhibitory circuits that parse and refine its pathways (Sakagami, Pan et al. 2006). The end result is that speech and motor functions become whittled down to fractional versions of themselves. That process enables us not only to talk implicitly to ourselves but to listen covertly to ourselves, because even covert auditory attention is governed by the prefrontal cortex (Benedict, Shucard et al. 2002). It also enables us to manipulate the environment covertly and in effect rehearse, reflect and predict events and outcomes. It is interesting that despite having no specific function – as seen in the classic Phineas Gage head injury episode (Macmillan 2000) – the frontal lobes have more connections to other brain sites than any other region (Lacruz, Garcia-Seoane et al. 2007). Thus they are both general and highly influential – the perfect format for an oversight circuit capable of converting external into internal experience.

Some students are less developed in these functions. While they might have normal speech and fine motor proficiency, they are less adept in the area of fractionated motor and speech functions. In simpler terms, they do not and cannot talk and listen to themselves covertly as a way of working through task work and social situations, or as a means by which to modulate emotional reactions. In effect they have limited internal access.

This is especially important with regard to emotional dynamics, because many types of phobia seem to be related to skill deficits in the self-regulation domain (Rapee & Heimberg 1997). For that reason, a therapeutic/behavior management model that incorporates self-talk and self-regulation into a treatment approach might be effective. The following suggestions combine anxiety-reducing tactics such as relaxation training and assertiveness training with self-regulation. The model is not based on research; rather, it is proposed as a speculative model (subject to creative revision by school counselors and psychologists) that just might prove effective in dealing with school phobia.

PRINCIPLES

Anxiety can be defined as an unmanageable arousal level of global proportions. Its main problems are uncertainty (not having a behavior by which to control it) and over-generalization (not being able to compartmentalize arousal so as to parse and minimize its impact).

The method here includes three components: relaxation/desensitization, assertiveness and self-talk regulation.

Strategies: Anxiety in specific or general situations can be controlled behaviorally by reversing the factors mentioned above, for example by…
1. Whittling arousal down to narrower influence through self-talk and self-control labeling skills that categorize, parse and ameliorate its effect.
2. Employing relaxation exercises to reduce arousal prior to engaging the anxiety-laden situation.
3. Expressing assertive behaviors to enable a semi-aggressive response that drowns out the inhibitory effects associated with anxiety in those circumstances.

METHOD

The first step involves discussion of the student’s commitment and motivation.

The second step involves identification of anxiety-provoking circumstances and completion of a rating scale (perhaps 1-10, from least to most fearful).

The third step involves learning and practicing relaxation exercises, self-talk strategies and assertive behaviors (scripts to use) that are comfortable for the student and will be used in real situations. This is done in the counseling office over several sessions.

The fourth step involves imagining the anxiety-laden situations while in a state of relaxation and while engaging in an assertive behavior (in the office).

The fifth step involves asking the student to:
a. Use a brief relaxation exercise before entering the anxiety-provoking situation.
b. Use two self-talk scripts while in the situation:

The first involves acknowledging the anxiety (“Oh boy, this is hard,” etc.).

The second involves compartmentalizing/parsing via a self-talk response (“It’s just a damn classroom; it won’t kill me”).

c. Express the assertive response in the anxiety-provoking situation – possibly a firm greeting to another student or a witty remark to override inhibition/anxiety.

These steps would be carried out gradually; the actual gradation will depend on the person’s learning curve.

MEASUREMENT

An ongoing fear rating scale could be filled out weekly at first to see whether anxiety has diminished and to what extent – the feedback will help the student recognize his mastery over the fears as well as provide an assessment of progress.
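As a purely illustrative aid (not part of the proposed model), the weekly rating-scale tracking described above could be recorded in a simple script. The situation names, ratings, and function name below are hypothetical examples, assuming a 1-10 scale filled out once per week:

```python
# Illustrative sketch: tracking weekly fear ratings (1-10 scale, least to most
# fearful) for each anxiety-provoking situation identified in step two.
# All situation names and ratings here are hypothetical.

def summarize_progress(weekly_ratings):
    """Compare each situation's first and latest rating and report the change."""
    report = {}
    for situation, ratings in weekly_ratings.items():
        change = ratings[-1] - ratings[0]  # negative change = anxiety diminished
        report[situation] = change
    return report

# Hypothetical four weeks of self-reported ratings
ratings = {
    "entering the classroom": [9, 8, 6, 5],
    "speaking to a peer": [7, 7, 5, 4],
}

print(summarize_progress(ratings))
# prints {'entering the classroom': -4, 'speaking to a peer': -3}
```

Showing the student the change score for each situation gives concrete feedback on mastery, which is the stated purpose of the weekly ratings.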

REFERENCES

Benedict, R., Shucard, D.W., Santa Maria, M.P., Shucard, J., Abara, J.P., Coad, M., Wack, D., Sawusch, J., Lockwood, A. (2002) Covert Auditory Attention Generates Activation in the Rostral/Dorsal Anterior Cingulate Cortex. Journal of Cognitive Neuroscience, 14 (4), 637-645.

Gross, J.J. (1998) The Emerging Field of Emotion Regulation: An Integrative Review. Review of General Psychology, 2, 271-299.

Lacruz, M.E., Garcia-Seoane, J.J., Valentin, A., Selway, R., Alarcon, G. (2007) Frontal and Temporal Functional Connections of the Living Brain. European Journal of Neuroscience, 26 (5), 1357-1370.

MacMillan, M. (2000) An Odd Kind of Fame: Stories of Phineas Gage. MIT Press, pp. 116-119.

Mennin, D.S. (2004) Emotional Regulation Treatment for Generalized Anxiety Disorder. Clinical Psychology and Psychotherapy, 11, 17-29.

Rapee, R.M., Heimberg, R.G. (1997) A Cognitive-Behavioral Model of Anxiety in Social Phobia. Behaviour Research and Therapy, 35, 741-756.

Sakagami, M., Pan, X., Uttl, B. (2006) Behavioral Inhibition and Prefrontal Cortex in Decision Making (Neurobiology of Decision Making special issue). Neural Networks, 19 (8), 1255-1265.
