
Essay: Music, Neurology and Human Experience

March 4th, 2017 by Robert DePaolo | Posted in Psychology


This article discusses the neuropsychological underpinnings of rhythm and the musical experience with respect to the influence on emotional states, neural quiescence and what is referred to here as the optimal experience. The point is made that the combination of rhythm, tempo, dissonance-resolution and rapid closure comprises an ideal state of mind, and that this phenomenon can tell us a great deal about the general functions of mind and brain.

A Neuro-Duality…

Although it is somewhat unorthodox with respect to current ways of thinking, I would like to arbitrarily divide neuro-experience into two parts. One is discrete: it consists of inputs separated from one another by time and association. The other (which comprises the main theme of this article) is defined as “continuous experience.” It consists of input packages linked together by a rhythmic nexus, i.e. a binding aspect of the input that, while maintaining a discrete “quantum” separation, nonetheless impinges on the central nervous system as “one” (wave-like) stimulus (Wade, 1996).

Ergonomic Bridges…

There are many differences between the discrete and continuous input experiences, but one of the more critical is the amount of work required to process either configuration. While the brain obviously categorizes and discriminates among stimuli in discrete fashion, it appears to operate ultimately in a holistic manner by linking inputs together for the sake of mnemonic/retrieval convenience and energy conservation. That is because holism holds entropy at bay (Prideaux 2000). In other words, while our sensory parsing capabilities enable us to differentiate one thing from another, the drift in the brain (in storage and alignment) is toward a grouping process where links among discrete aspects of experience are assembled into concepts. This facilitates memory, both through generalized associative ties and in terms of the spatial topography of the brain. By narrowing input volume through broad, multi-referential links we are able to store information redundantly, i.e. in various sites around the brain. The capacity to convert high-detail/high-volume, discrete item-by-item inputs to broader/low-volume (streamlined) conceptual groupings enables us to maintain memories and learning sets despite loss of neural tissue from aging and injury (Jibu, Pribram et al. 1996).

The way in which this works is fairly simple. In a language context the brain converts items such as “orange,” “apple” and “pear,” or “elephants,” “lions” and “tigers” (which comprise three bits of information respectively) into the single information bits “fruit” or “animals.” While such a process might seem to risk loss of detail memory (especially when dealing with more complex items), the opposite is true. By cross-referencing “fruit” and “animals,” detailed, derivative associations are brought up via a reciprocal mechanism. It is essentially a holographic process, whereby partial stimuli contain all the information in the category and can therefore be accessed economically.
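As a rough illustration (a toy sketch of the idea, not a model of actual neural storage), the reciprocal category mechanism can be expressed as a pair of lookup tables: the category label compresses its members into one stored “bit,” while an inverted index lets any single member bring the whole group back up.

```python
# Toy sketch of reciprocal category storage: a few items are stored
# under one category label, and any member cues the whole category.
# The names and structure here are illustrative, not a neural model.

CATEGORIES = {
    "fruit": ["orange", "apple", "pear"],
    "animals": ["elephants", "lions", "tigers"],
}

# Inverted index: each member points back to its category
# (the "reciprocal mechanism" described in the text).
MEMBER_TO_CATEGORY = {
    member: category
    for category, members in CATEGORIES.items()
    for member in members
}

def recall(cue):
    """Given a partial cue (one member), retrieve the whole group."""
    category = MEMBER_TO_CATEGORY[cue]
    return CATEGORIES[category]

print(recall("apple"))  # ['orange', 'apple', 'pear']
```

The point of the sketch is economy: only two category labels need to be held “in front,” yet nothing is lost, because each member remains recoverable from the cross-reference.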

In that context Pribram proposed a holographic theory of memory (1999). This theory purports to explain why memory is stored and conserved despite loss of neural tissue and other mitigating factors. A holograph is a mechanism in which wholes can be retrieved from partial, surface stimuli by use of a coding process that triggers holistic (multi-dimensional) retrieval. One advantage of a holograph is that it needn’t store all the information on one plane or storage container because the code in itself converts parts into wholes. (In Pribram’s model the code is transmitted by brain wave activity; specifically through a process he referred to as a slow potential microstructure).
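The retrieval-of-wholes-from-parts idea can be made concrete with a standard associative-memory model. The sketch below uses a Hopfield-style network, which is a textbook construction rather than Pribram’s own formulation, but it captures the same principle: the pattern is stored redundantly across every connection weight, so a corrupted or partial cue still settles onto the complete stored whole.

```python
# Minimal Hopfield-style associative memory (illustrative only; this is
# a standard model, not Pribram's slow-potential-microstructure account).
# A pattern of +1/-1 values is stored across all pairwise weights, so a
# partially corrupted cue recovers the full stored pattern.

def train(patterns):
    """Hebbian storage: each pattern is spread over every weight."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, cue, steps=10):
    """Iteratively settle from a partial cue toward the stored whole."""
    state = list(cue)
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

stored = [1, -1, 1, -1, 1, -1, 1, -1]
w = train([stored])

cue = list(stored)
cue[1] = 1                         # corrupt part of the pattern
print(recall(w, cue) == stored)    # True: the whole is recovered
```

Note the analogy to the holograph described above: no single weight holds the pattern, yet every fragment of the cue participates in reconstructing it, which is why such storage degrades gracefully when individual connections are lost.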

For purposes of discussion, all versions of conceptual processing will be referred to as an “experiential bridge,” the benefit of which lies in the capacity to treat myriad inputs and experiences as “one.” While, as discussed above, this mechanism has a language component, it also applies to non-linguistic phenomena. Indeed, it appears to be more broadly related to basic neurology – specifically as it pertains to the rhythms and cadence of neural activity.

Rhythm and Ergonomics…

Since experiential bridging transcends language, the question could be raised as to how non-linguistic creatures conceptualize their experiences. One possibility might lie in the tendency of any creature’s brain to conserve energy by a “governing cadence” (which is a rhythmic, wave-like code). While discrete bits of information are easy to perceive and associate, the separate processing of experience is less energy efficient because of the prolonged alternation of excitatory and inhibitory networks involved in one-at-a-time stimulus sequencing. Put more simply, brains work harder with details than with more global stimulus groupings. That is perhaps why brains gravitate toward holistic experience.

One fairly clear confirmation of that trend lies in the fact that the brain operates according to a mass action process (Lashley 1929). In any learning or experiential instance it summons mass arousal prior to sifting through to the details, in what amounts to an information-friendly whittling down from noise to codification, that is, from uncertainty to resolution. The advantage is that globalizing inputs enhances behavioral and cognitive options. Despite the temporary logjam involved in sifting through circuits, this actually improves response efficiency by streamlining the otherwise grueling work involved in stop-and-start excitatory/inhibitory activities.

There is an obvious caveat to this. Piecemeal (quantum) discrete chunking of stimuli is necessary with respect to the initial perception of inputs. As discussed above, stimuli that are continuous and blurred together in a wave-like holistic stream create the possibility that specific details will be lost. For continuous (wave-like) experience to work it must have an interpretive element or code by which to ameliorate noise, and despite the overlap of input flow, preserve the information content. Just how continuous experience could co-exist with associative specificity entails further discussion.

Song, Dance and Information Content…

A blend of stimuli can provide superior information content if it is regulated by a cadence/code providing an overriding grammar to the stimulus flow. Despite its common use as a language/cognitive element, “grammar” – that is, grammatical organization – is a broader aspect of experience and is always some function of rhythm. But what is rhythm? Many thinkers have addressed this question, perhaps the most interesting theses revolving around the related experiences of rhythm and pleasure (Blood, Zatorre 2001).

There are a number of reasons why rhythm is a pleasurable experience, two of which pertain to its capacity to both minimize aversiveness and maximize the enhancement of experience. Reading the names in a phone book would, over time, lead to yawns and pleas to end the exercise. The same result would ensue from reading a newspaper column verbatim, or a garden-variety term paper by a high school student. But when cadence is added to the exercise – for instance in a song lyric or poem (both of which include cadence, stanza, verse and other codes that provide both a broad concept and an internal structure) – then the listener will more likely be attentive and inspired.

What happens to the mind under the above circumstances? For one thing, the blend of mass brain arousal accompanied by a coherent cadence makes for a more comfortable assimilation of both complexity and recognition. The work required to convert one to the other is just enough to produce pleasurable feelings derived from ultimately reaching closure – not enough to produce mental fatigue; bearing in mind that the attainment of pleasure requires some degree of work. The neurophysiological consonance resulting from closure, which in turn occurs in the transition from uncertainty/novelty to resolution, has been linked to a state of pleasure (Berlyne 1960).

The activation of pleasure-perception phenomena adds emotion to the mix. It facilitates more fervent and efficient attention to the details of the song/poem or other encoded experience. By superposing positive affect on cognition, such rhythmic experiences have lasting effects on memory and behavior. In other words: feel better, learn better. (One anecdotal proof of this can be seen in the enormous legacy of the ancient biblical prophets, whose words have endured for millennia despite a dearth of written documents and, for that matter, literacy among the people of the time.)

Ancient Messaging…

Once upon a time man had little or no technology. Yet he still had to communicate, learn and remember. Papyrus and stone tablets were available mostly to the privileged and only through painstaking craftsmanship. Despite that, lessons were learned and imparted among all strata, and those lessons have endured to this day in the form of religious and historical texts, psalms, and poems. One reason for this informational permanence is that the lessons were typically sung rather than spoken. The biblical prophets have been described as whirling dervishes, jumping about, dancing and singing hypnotically for maximum communicative impact (2017) – and perhaps because they enjoyed the jolt themselves.

Modern Messaging…

In modern times the centrality and efficiency of rhythmic experiential bridging is rather ignored – indeed, as separated from everyday life as a concert is from a classroom lecture. Yet, because our brains are what they are, the process somehow persists. To maximize fan interest and investment, virtually all sports contests feature musical preludes and interludes after nearly every time out, and even during the action, to override the (literal) nuts and bolts of mere competition. Political conventions, graduation exercises, weddings and commercials all use music and cadence to teach, inspire and influence. It is obviously a time-honored method in teaching young children.

We humans seem to recognize instinctively that our most vivid experiences occur when wave-like inputs mimic the natural, neuropsychological traits of the human brain, in what amounts to an optimal experience featuring consonance between input and mind. Whether or not we could more effectively utilize this mechanism (in an “old school” sort of way) to improve human learning, creativity and cultural development is another question. Will homo sapiens ever re-aspire to a mode of expression through which neuro-musical visions move us toward a new/old mode of expression grounded in happiness rather than the drip, drip, drip of exhaustive detail, endless self-examination and guilt? It’s anybody’s guess.


Berlyne, D.E. (1960) Conflict, Arousal and Curiosity. McGraw-Hill.

Blood, A.J., Zatorre, R.J. (2001) Intensely pleasurable responses to music correlate with activity in brain regions implicated in reward and emotion. Proceedings of the National Academy of Sciences, 98(20): 11818-11823.

Jibu, M., Pribram, K.H., Yasue, K. (1996) From conscious experience to memory storage and retrieval: the role of quantum brain dynamics and boson condensation of evanescent photons. International Journal of Modern Physics, 10: 1735-1754.

Lashley, K.S. (1929) Brain Mechanisms and Intelligence: A Quantitative Study of Injuries to the Brain. Chicago: University of Chicago Press.

Pribram, K.H. (1999) Quantum holography: Is it relevant to brain function? Information Sciences, 115(1-4): 97-102.

Prideaux, J. (2000) Comparison between K. Pribram’s holographic brain theory and more conventional models of neural computation. Internet article, Virginia Commonwealth University.

Prophets reference: Internet article, Teaching Messianic Jewish Biblical Truths – Worship and Dance; Jewel Television Series, March 2017.

Wade, J. (1996) Changes of Mind: A Holographic Theory of the Evolution of Consciousness. Albany: State University of New York Press.


Evolution, Organic Feedback and Neo-Lamarckism: Is Natural Selection a Sufficient Explanation?

November 22nd, 2016 by Robert DePaolo | Posted in Psychology


This article offers alternatives to Darwin’s theory of natural selection as a complete explanation of evolution. Two other possibilities are discussed, including a revision of Lamarckism. The argument is offered that while there is probably not a direct, imminent mutative response to environmental changes as in Lamarck’s model, a feedback mechanism might exist within the interactions of DNA, mRNA and protein synthesis vis-à-vis environmental shifts that induce organic stress and lead to a state of genomic agitation (i.e. negative feedback) that creates uncertainty in biochemical assembly and over time increases the probability of trait mutations.

Topsy-Turvy Naturalism

Sometimes it seems the idea of randomness as applied to adaptation is hard for even Darwinian adherents to conceptualize, possibly because it is non-deterministic and perhaps a bit alien to scientists. Darwinian advocates often allude to “organisms developing large molars to accommodate a vegetarian diet,” or perhaps “a coat color mutation creating a blend with tall grass to facilitate stealth and predation.” The fact that natural selection could lead to random mutations that could be advantageous is not in question here. Undoubtedly that process occurs at some (trial and error) level of probability. Yet it seems incomplete.

Take tooth development. The notion that there is an implicit congruence between a vegetarian diet and having small canines and large molars implies several things that are sequentially confusing. One pertains to the question of whether certain creatures began eating plants, then evolved teeth more suitable to grinding or whether the mutation to smaller canines and enlarged molars came first, leading to a “decision” by the creature to shift to a plant diet. Since most creatures have brains, evolution can arguably never be separate from cognition – a point that will be revisited in the discussion on Neo-Lamarckism.

It raises other questions. For example: why would a creature change its dietary preference? If it never ate plants before the mutation, its taste buds, perceptual attractions, digestive tract, metabolism and behavioral instincts would have to change along with its teeth. If it did eat plants prior to the dental mutation, why the need for large molars and small canines? If the creature had subsisted as a herbivore prior to the mutation, what benefit would the tooth restructuring provide – bearing in mind that many primates with large canines feed primarily on plants? With regard to the efficiency of tooth structure, meat eating involves chunking morsels down as opposed to chewing, but one could do that with lettuce as easily as with a leg of zebra.

Another problem with Darwin’s theory is his notion of sexual selection. In choosing mates, females could certainly select some traits over others, but that selection process might run counter to evolutionary change; not just because the meshing of paternal and maternal genes tends to stabilize the gene pool but because females typically select males with traits that represent the current state of the species. For example, human females value language capacities in men, which happen to be unique to our species. Some female birds select males who sing well, or those with elaborate plumage that exemplifies the best status quo traits of the species. In that sense sexual selection would seem to militate against evolutionary change. The females in any given species cannot be prescient enough to select mates with unusual traits that may or may not prove advantageous down the road. They do not mate by chance but by purpose, which would tend to skew the evolutionary process toward species norms.

Another problem lies in the fact that the biological world and all its creatures operate by a homeostatic process. When there is a disturbance in any physiological system, the tendency is for cybernetic (corrective) responses to restore stasis. The fact that there are regulatory genes in the genome and also communicative interactions between RNA and DNA regarding the appropriate synthesis of proteins suggests homeostasis operates at the biochemical level. That suggests a holistic process in gene alignment rather than peculiarities cropping up here and there that would be subject to nature’s scrutiny.

Finally, there is the problem of chronology. Whenever an opponent of natural selection argues that it is impossible for the order and complexity seen in all organisms to occur over time the counter argument is that life has been around for well over a billion years and that humans cannot conceive of such extended time spans – that they can’t impose anthropocentric judgments on evolutionary probabilities.

In some ways that argument is legitimate, except for one thing. Evolution does not happen “over time.” It often occurs in short bursts – in other words, it is punctuated. For most of those billion or so years, organisms did not change much at all. Then, in dramatic spurts, they did – a pattern that is hard to explain unless one assumes some cause-and-effect relationship between relatively sudden changes in the environment and an increased rate of mutation.


Based on such snags in the theory of natural selection, there are alternative ideas that would not refute Darwin’s theory but might augment, or even surpass, it as a model. One, proposed by this author, is progressive encoding (DePaolo 2007). To explain this requires a look back at the origin of life. Many have grappled with the question of how life began on earth. Since even bacteria are incredibly complex, it is clear such systemic entities did not arise from spontaneous generation (Wald 1954), or from a floating DNA molecule that out-competed other macromolecules that did not replicate (Muller 1966), or from some sort of proto-biotic protein that could build somatic structures and also reproduce (Fox 1977). One question raised by Shapiro (2006), who wrote a very insightful book on the subject of origins, was how a system comprising life – with its self-sustaining and reproductive capacities – cropped up in the first place.

Bits of Existence

Progressive encoding is based on Information Theory concepts. Without going too far off the subject, this theory can be whittled down to two main concepts. One element is noise, which is a super blend of elements without distinction, thus without information content. In simple terms, if every component in a whole is exactly the same there can be no identities within the structure, thus no capacity to separate one item from another. Nor could any one element or the “super blended whole” have any capacity for communication, because information transmission requires at least two separate signals.

The other component is information, or a code, which does feature distinctions that operate outside the overall “blend” and, while interacting with it, are not incorporated into the whole. As an illustration, consider a room full of people who all look exactly alike, have the same exact name, and walk and talk exactly alike. In such an instance there would be no personhood. On the other hand, if one person “broke loose from the pack,” found a way to talk and look differently and assigned himself a separate name, those distinctions would comprise three bits of information: one bit (or distinction) being unique language, a second bit pertaining to a distinct appearance, a third resulting from his having a unique name.

Information can be quantified. The general formula is that each resolution of noise or uncertainty (undoing of the blend) comprises one bit of information (Shannon, Weaver 1949; Pierce 1961; McEliece 2002). One interesting aspect of this scenario is that despite being differentiated from the others, the “information man” would still have to interact with them. If he remained isolated he would eventually become, himself, a monotonous system with no distinction or information content. He would regress toward a “noisy” blend. Thus the influence of information dynamics in nature is sequential and mandatory. A component can break out of a uniform system and become informed/encoded, but to remain solvent it must interact with other components or systems that in turn are distinct from it, lest it lapse into an insular state of noise, i.e. entropy.
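Shannon’s formula makes the quantification concrete: an outcome with probability p carries -log2(p) bits, and the average over a distribution is its entropy. The sketch below (an illustration of the standard formula, not anything specific to this article’s model) shows that a fully uniform “blend” carries zero bits, while each genuine distinction adds information.

```python
# Shannon entropy: average information, in bits, of a distribution.
# A system whose elements are all identical (pure "noise" in the
# article's sense of an undifferentiated blend) carries zero bits.
import math

def entropy(probabilities):
    """Average information (bits per symbol) of a probability distribution."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A room of indistinguishable people: one "type", probability 1.
print(entropy([1.0]))        # 0.0 - no distinctions, no information

# One person breaks from the pack: two distinguishable types.
print(entropy([0.9, 0.1]))   # ~0.469 bits

# Two equally likely, fully distinct alternatives: one full bit.
print(entropy([0.5, 0.5]))   # 1.0
```

This matches the text’s claim term for term: resolving one maximally uncertain two-way distinction yields exactly one bit, and the closer the system sits to uniformity, the less information it contains.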

Yet that process has a major snag. Any system that expands and becomes more interactively variable will run the risk of chaos. In order to maintain the integrity of the overall system there must be interactive rules, that is, a governor. In the hypothetical social example discussed above the rules might revolve around a common language, and perhaps rules on social probity. In the body the rule is homeostasis i.e. an oversight process that recognizes errors in the overall system and can summon substrate organs to make readjustments regarding body temperature, blood flow, caloric count, metabolism, cellular maintenance etc.

From People to Molecules

The proto-biotic molecules on earth, poised potentially to ratchet into life forms, were faced with a “noise” problem. Such molecules no doubt cropped up periodically in some sort of assimilable form, but due to disruptive lightning storms, water flow and extreme shifts in temperature between day and night they would break up and re-blend with the surrounding milieu. For life to evolve into a system with anchor-point structures and functions, it required a mechanism by which it could separate from its surroundings; specifically, a semi-permeable membrane. Membranes insulated cells from the tumult of the outer world and enabled them to function independently. The advent of the first semi-permeable membrane did not allow for complete independence but rather created a capacity to remain separate yet communicate with the environment, so the cell could absorb and discard energy and obtain and act upon information about the outside world.

Membranes consist of lipids, which are fats. Fats provide an ideal insulation against water and other agents. With their incorporation into the cell structure came a new process of evolution, characterized not just by natural selection but by an increasing tendency toward organic insulation. First came membranes to insulate the cell proper. Then, with the advent of eukaryotic cells, came nuclei and other organelles to provide further layering between the inner and outer world. And as organisms proceeded to become larger and more complex came further insulation in the form of organ structures with specialized functions, such as the alimentary tract, lungs (or gills), muscles, hearts and brains.

The advent of separate and distinct organs led to an increase in bio-information content. An important aspect of information content (even in a biological context) is redundancy. By way of explanation, the main purpose of all biological systems is cellular integrity. Cells need nutrition (including oxygen) to survive. Having lungs to inspire, a heart and vessels to pump and transmit, and muscle to burn sugars and signal the need for replenishment creates a very efficient division of labor that insulates the organism against a failure in any one system and increases the odds on cellular survival. In short, the number of layers (distinctions/bits) in the body provides both increased insulation from the environment and a parallel increase in bio-information content. In that context the amount of information contained in any given organism delineates its separation quotient from the environment and correlates highly with adaptation and survival.

Snags in Complexity

According to Progressive Encoding Theory life did not merely emerge and evolve as a retroactive means of adapting to its environment but also as a means of avoiding environmental influence through the complexity of organic structural and functional insulation – in other words via enhanced information content. The progressive encoding process provided organic stability, complexity, inter-organic communication/cooperation and also produced a template by which organisms could continue to get larger, more internally complex and poly stable.

While speculative, this concept could be used to explain why mammals became homeothermic and why humans developed a brain capable of imaginative cognition. Being able to interact covertly through imagination, anticipatory thought and planning might have been just another step in the progressive encoding sequence that insulated us yet further against the influence of the outside world. This might have been a continuation of the rudimentary noise reduction process that created the initial separation of cell from environment, and it might help explain why human brain expansion enabled us to create the separate, non-experiential and insular worlds of art, science, empathic morality and politics. In other words, the same mechanism that made life’s onset possible was also responsible for the paintings in the Sistine Chapel.

Another Alternative

Even if progressive encoding can be viewed as a co-causal process in organic evolution, it is probably not the only complementary factor. This author is fond (in an ironic sort of way) of statements by dyed-in-the-wool scientists that either personify basic biological phenomena or state them in deterministic language. For example, Richard Dawkins (1976) and Carl Sagan (1980) both alluded to the idea of the “selfish gene” – the idea being that mere macromolecules are capable of instructing organisms on how to behave and think, all for purposes of maintaining the gene pool. The dictatorial qualities they assign to genes are interesting on many levels. First, it implies that an entity without mind is controlling entities with minds (begging the question of what brains are for in the first place). Second, it suggests subjects like morality, social cooperation, sexual interest, even the use of deceptive behavior are driven by molecules, the rest of our bodies oblivious enactors of plans drawn up in the primordial soup several billion years ago. Perhaps we haven’t changed much after all.

At face value the selfish gene concept might seem dubious. Then again, haven’t biologists discovered the highly communicative interactions between mRNA, DNA and protein synthesis? Is it not the case that chemicals correct errors, set up complex chemical pairings on the double helix, perform editing functions via RNA interruptions and tell protein composites how and where to line up in gestation? All these well-documented functions do exist. It turns out many of the decisions we consider cognitive occur in the smallest of contexts. That leads to discussion of a newly emerging concept of evolution based on the initially refuted theory of Jean Baptiste Lamarck.

The Origin of Theory

Jean Baptiste Lamarck was one of several early thinkers on the subject of evolution. His model, often referred to as “soft inheritance” or “use/disuse” theory, held that organisms evolved traits in response to environmental pressures. Unlike Darwin, he saw organic change as more purposeful than purely accidental. The key element in his theory was not that individual organisms could change in the face of environmental pressures but that such changes could be passed on to subsequent generations. One flaw in his model was the notion that evolution inevitably proceeds toward order – that there is an implicit (almost Platonic) drift toward organic perfection. That was a bit too anthropocentric for most scientists. The purposeful adaptation vs. random change distinction is important because it is well known that any given creature can alter its morphology in response to environmental changes. A thin person living in cold climates can “fatten up” by eating certain foods and by becoming less active – two mechanisms for sustaining energy reserves. The real question is whether such changes can be carried over to new generations. In other words, will the person’s newfound girth and metabolic shift show up in the body type or metabolism of his offspring?

Early research seemed to disprove Lamarck’s theory, the most notable being Weismann’s study (1889), which showed that mice whose tails were severed did not over several generations produce tailless offspring. However, in hindsight this and other studies seem suspect. For example, severing tails in an artificial, experimental context had nothing to do with extant environmental pressures occurring over time. The experiment did not include environmental pressures mitigating the need for tails – as, for example, if long tails over time made the animal more susceptible to predation, i.e. easier to catch.

Weismann’s refutation prevailed in any event and natural selection remains a mainstay of evolution theory. On the other hand science never stands still and recent research has led to a modification of use/disuse theory in a new model supported by the idea of epigenesis.

Softer Inheritance

Based in part on questions regarding natural selection raised by Gauthier (1990) and others, Neo-Lamarckian theories have arisen, with roots in several areas of study. All of these models adhere to the notion that traits acquired in light of environmental changes can be passed on to subsequent generations. All of these are refutations of the germ plasm theory, which derives from natural selection and holds that the somatic experiences of one individual or generation will not register with the DNA and consequently are not heritable. Studies in the field of trans-generational epigenesis have shown that cellular and physiological traits that do not correlate with changes in the DNA sequence are heritable by daughter cells (Jablonka, Lamb 1995; Jablonka 2006). That would seem to challenge the idea of an exclusive connection between genes and mutations. One such study showed that altering the diet of mice with dietary supplements led to changes in expression of the Agouti gene, which is involved in color, weight and cancer proneness. Thus it seems generational changes can transfer to the traits of offspring even without changes in the genetic code.

Other studies have offered challenges to the natural selection model. For instance, the function of stem cells as macro-generators of more specific cells raises questions about the direct link between specific genes and inherited traits (Skinner 2015). In this instance changes were determined by stem cell generativity without a corresponding change in the DNA of specific traits. Offering still another challenge to natural selection are “prion” studies, which have shown that proteins can catalytically convert and reduce a protein’s activity, and that micro-RNA can cause a delay or disruption in the communication between messenger RNA and protein synthesis (Krakauer, Zanotto et al. 1998). In a sense epigenetics turns the entire concept of evolution upside down. It not only brings into question the legitimacy of natural selection theory but also offers an alternative mechanism for how traits are passed on.

Quite obviously the simple notion that genetic mutation, superimposed on environmental change, determines which traits emerge and which organisms survive is a bit lacking as a complete explanation. Still, it is not a model easily abandoned, in part because of its simplicity. The question is: where does one find a good fit among the progressive encoding, epigenetic and natural selection models?


A key element in evaluating evolutionary thinking lies in the concept of feedback. In a sense the epigenesis studies demonstrate that on some level a feedback/registration mechanism does exist between the outside world, the genes and the soma. It seems the genome (ancient and unmalleable as it might seem) is aware of an organism's response in light of environmental pressures. Yet evolution is such a long-term process that one must explain exactly what happens in that interaction. In other words, if genes can change in response to the environment, why would this occur only over millions of years, or in punctuated manner during dramatic environmental shifts?

One possible resolution is to assume, in line with the progressive encoding theme, that environmental changes can, over time, cause genomic discord: disturbances in the alignment process, in the form of negative feedback between organs and genes, which degrade the pristine structure of the genetic code. If the genome and soma do communicate with respect to prolonged hormonal, anatomical and physiological duress, might the system begin to quaver a bit, producing noise? Furthermore, might that noise go unresolved for long periods of time, perhaps exacerbated during environmental disasters such as glaciation or volcanic-induced shifts in terrain, until so much noise accumulates that genetic restructuring becomes more rapid? In other words, with greater genomic discord, do mutations reduce so much noise as to produce a leap forward in manifest traits through the thrust of emerging information content?
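The idea that a mutation "reduces noise" can be restated in Shannon's terms, where information gained equals uncertainty removed. The sketch below is only an illustration of that bookkeeping; the two probability distributions are hypothetical, standing in for a "noisy" genomic state with several equally likely variant expressions and a "resolved" state in which one variant dominates.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical "noisy" state: four variant expressions, all equally likely.
noisy = [0.25, 0.25, 0.25, 0.25]

# Hypothetical state after a stabilizing mutation: one variant dominates.
resolved = [0.85, 0.05, 0.05, 0.05]

# Information gained = uncertainty removed by the transition.
gain = shannon_entropy(noisy) - shannon_entropy(resolved)
print(f"H(noisy)    = {shannon_entropy(noisy):.2f} bits")   # 2.00 bits
print(f"H(resolved) = {shannon_entropy(resolved):.2f} bits")
print(f"information gain = {gain:.2f} bits")
```

On these toy numbers the entropy drops from 2.00 bits to roughly 0.85 bits, so the "leap" the text describes would register as about 1.15 bits of information gained.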

Looking at evolution in information terms allows for the implied purposefulness seen in epigenetic studies. In some sense this idea is reminiscent of Freud's tension reduction theory, in that the organism can be viewed as a physical system governed by homeostasis. If feedback communications between genes and soma do exist, then the genes, "concerned" as they are with survival and propagation of the gene pool, would tend to process threats to that mandate.

One way to prove or disprove an information-based model of evolution would be through research; specifically around the question of whether dramatic or prolonged environmental changes correlate with increased errors in genetic/molecular alignment, skewed reactions in mRNA, or a proliferation of discard genes that appear to have no influence on trait manifestation but can signify an increase in noise in the genomic system. If such a cause-effect tumult can be proved to exist, there might be a theoretical shift beyond the scope of natural selection and epigenesis toward an information-based model of evolution, which assumes that environmental shifts, increased organic duress and an increase in somato-genetic uncertainty can lead to an increased probability of evolutionary change. If so, then perhaps Heisenberg's description of Information Theory as the "theory that decides" might prove accurate.



Dawkins, R. (1976) The Selfish Gene. Oxford University Press

DePaolo, R. (2007) Evolution, Information and Personality; Toward a Unified Theory of the Psyche, Universal Publishers.

Fox, S., Dose, K. (1977) Molecular Evolution and the Origin of Life. New York, Marcel Dekker

Gauthier, P. (March-May 1990) Does Weismann's experiment constitute a refutation of the Lamarckian hypothesis? Florence, AL, Beta Beta Beta Biological Society (12) 6-8.

Jablonka, E. (2006) Evolution in Four Dimensions; Genetic, Epigenetic, Behavioral and Symbolic Variation in the History of Life. Cambridge, MA MIT Press

Jablonka, E. Lamb, MJ. (1995) Epigenetic Inheritance and Evolution. Oxford University Press

Krakauer, D.C., Zanotto, P.M., Pagel, M. (1998) Prion's progress: patterns and rates of molecular evolution in relation to spongiform disease. Journal of Molecular Biology. Aug. (2) 133-145.

McEliece, R. (2002) The Theory of Information and Coding. Cambridge University Press

Muller, H.J. (1966) The gene material as the initiator and organizing basis of life. American Naturalist (100) 493-517.

Pierce, JR (1961) An Introduction to Information Theory; Symbols, Signals and Noise. Dover Press (2nd Edition)

Sagan, C. (1980) Cosmos. New York. Random House

Shannon, C., Weaver, W. (1949) A Mathematical Theory of Communication. Urbana, University of Illinois Press

Shapiro, R. (2006) Origins: A Skeptic's Guide to the Creation of Life on Earth. Summit Books

Skinner, M.K. (2015) Environmental epigenetics and a unified theory of the molecular aspects of evolution: a Neo-Lamarckian concept that facilitates neo-Darwinian evolution. Genome Biology and Evolution. Cary, NC Oxford University Press.

Wald, G. (1954) The Origin of life. Scientific American

Weismann, A. (1889) Essays Upon Heredity. Clarendon Press.


Modern Media and The Evolution of Personality

October 23rd, 2016 by Robert DePaolo | Posted in Psychology


by Robert DePaolo


This article discusses a potential restructuring of the psyche in response to the increased volume of media, including the monitoring of behavior through various techno-media outlets. The argument is made that given current trends, the triadic id-ego-superego personality structure could regress in a maladaptive sequence of extreme inhibition, sociopathy and finally broad social apathy.

Sigmund Freud's theory of personality was part biological and part social (Laplanche, Pontalis 1973). He proposed that the id was the bio-natural component of human nature, one that did not necessarily jibe with the demands of complex human society and was oblivious to phenomena such as ethics, mutuality and restraint. Being a neo-Darwinian, he saw this as an aspect of mind designed to aid in the survival and propagation of the human species. He also believed (somewhat ironically) that the id actually provided an ergonomic impetus for all psychic functions, including the conscience (Freud, 1920), (Freud, 1933).

Conversely, in Civilization and its Discontents he viewed the ego and superego as social phenomena that only arose after agricultural society became more complex, sedentary and interactive (2002 reprint). Large, diverse groups of people from different backgrounds, united only by the shared tasks of tilling the soil, building walls, temples and pyramids, and combining forces in war against rival armies, might not know or like one another, and might resent the differences between their various gods, rituals and languages, but they still had to work together to sustain their proto-sovereignties.

While the id would not, as per its basic biological topography, change over time, the ego and superego did evolve in accord with socio-cultural changes. Behavior forbidden in one era might become not only tolerated but commonplace in another. Superego evolution is of particular interest, since it provides a moral framework for both the individual and society. As Freud wrote, it can be too rigid, for example when mores and the influence of conscience create such behavioral restraint as to stifle creativity and paralyze the functional psyche. Or it can be too lax, such that diffuse primal energies cannot be systematically extricated for socially productive pursuits (Freud, 1987 reprint).

In fact the triadic mind operates by an algorithm of proportion. While laws, religions and ethics are created to maintain order, rescue the soul and preserve social probity, they all derive from a bio-social blend of instinct, vigilance and reason that provides not just order and spirituality but, as Freud suggested, marriage, art, politics, humor and music (Matte, 2001), (Glover, 2014). Since the relation between the ego and superego will shift with changing times, one might expect the current (and future) proliferation of media to affect the relationship between the two.

Forward and Back…

There seems little question that a major shift in the psyche has paralleled the spread of liberalism in western society; a phenomenon not confined to the post-'60s peace-love-and-drugs cultural explosion. While anyone alive during that time is aware that in prior decades sex without marriage, ribald media, drug use and other liberal manifestations were taboo, societies had been veering in a liberal direction for centuries. Much of this was in response to cruel, punitive policies enacted by various social agencies and systems. In a limited context liberalism served humanity well, but it also had a somewhat paradoxical effect on human culture. On one hand it led to disease (West 2008), substance abuse and social dependency (Rossiter, 2014). On the other hand, newer forms of art, music and literature emerged. It is a theme recurrent throughout history: new freedoms create both cultural progress and social duress.

Arguably, in more recent times, there has been a contraction of liberalism as a result of increased media scrutiny. As more subtle and sophisticated gadgets have been developed to monitor, view and record behavior, the clouds of social rejection, defamation and ostracism have loomed. Having lived through an extended period of liberalism, man now faces an abrupt U-turn. Like the imminent collision of converging tectonic plates, the previously liberated ego is about to collide with a newly aroused and dominant superego.

The end result might be manifest in various ways. One possibility is that acts of hyper-surveillance will be heavily reinforced by society and consequently shaped into a fairly permanent, habitual behavior pattern. Just as B.F. Skinner's pigeons obtained food-pellet rewards by pecking at a colored disc, so shall the ministers of surveillance be reinforced for pecking away at the reputations of others. Human behavioral dynamics tends to support that assumption.

The Fruits of Observation…

It seems our primate origins sometimes clash with our democratic principles. The very makeup of our genes suggests we are in fact a duality. While the notion of equality goes as far back as the reign of Darius II in Persia, and certainly gathered historical steam under the collective ideas of John Locke and Thomas Jefferson, we also have a fairly ingrained tendency toward social hierarchies. Some people are considered more beloved, more iconic (or less worthy) than others. That unfortunate polarity has persisted throughout human history.

An inevitable byproduct of a hierarchical mindset is competition, i.e. the quest to obtain more assets and social feedback than the other person. Any number of rewards, including fame, fortune and sexual access, can arise from that. So the person who seeks out and records bad acts by others can attain a degree of notoriety. Obviously, as per the First Amendment, some acts of surveillance are necessary to prevent tyranny and can serve society well. On the other hand there is, as with all other aspects of human experience, a threshold beyond which a necessity can become a liability. Moving beyond that point could have psychiatric as well as social implications.

For example, it could lead to social isolation. The extreme inhibition resulting from fear of exposure would tend to manifest itself in idiosyncratic thoughts, self-socializing, excessive fantasy and ultimately social inadequacy. To the extent that social inadequacy can be a prelude to social misperception and conflict, it also augurs poorly for the future of human culture.

This is true with regard to both journalistic and social media. The latter is virtually risk-free regarding the expression of imprecise, often critical, at times incendiary language. That has significant implications. A "blogger" does not have to read facial expressions, gauge vocal tone or register other signs of emotional expression on the Internet, which is a safe zone for hostility and deception. Hiding behind technology precludes retaliation. Being able to "trash talk" without consequences also reinforces negative expressive habits that in time can become entrenched as high-probability behaviors.

The problem with any technological advancement is that no matter how progressive, it can never exceed the parameters of human nature. We are finite creatures with a substantial but circumscribed temperament and range of capacities, and we really can't go beyond that without jeopardizing our social and political equanimity. The rapid development and enhancement of tools by which to observe one another cannot unfold without negative consequences. The blogger might avoid repercussions. Society will not.

The key in all this is whether the ultimate benefit outweighs the cost. We invented automobiles and airplanes that enable us to travel to all ends of the earth at breakneck speed. That led to expanding economies, greater access to medical facilities and vast interaction among peoples from other lands. The fact that many thousands of people die each year as a result of these modern inventions does not deter our need and appreciation for them, because we know full well that the social, temporal, medical, vocational and political advantages outweigh the risk.

Secret Sins…

Another possibility also derives from a human duality. While the ego is described by Freud as a creative regulator of psychic energy, the personality has another feature, best captured by Gestalt theorists and Jungians: the former of whom wrote about a bad me/good me dichotomy within the psyche (Perls, 1957), the latter about a psychic component referred to as "the shadow," which Jung felt was the seat of creativity (Jung, 1951). By their reckoning both aspects are necessary for holistic, integrative psychological functioning. This concept is a bit more concise than Freud's concept of the id, but in both instances the existence of the primal aspect of human nature is deemed necessary for psychological functioning.

Yet not all negative impulses can be channeled into prosocial pursuits. Like miscellaneous (throwaway) genes that have no apparent function, some id manifestations are fairly superficial. They have little overall effect on either social norms or creative pursuits. These often manifest as minor anti-social actions almost everyone engages in for a variety of reasons, including release, convenience, socialization, identification and need gratification. In other words, we all exhibit behaviors derived from the "bad me" regardless of how moral we claim to be: speeding on a highway, expressing frustration privately to a friend in foul language, telling off-color jokes, engaging in "off limits" flirting at the office, ordering a fat-filled donut despite doctor's orders. Whether these acts serve any psycho-social benefit is uncertain, but they do serve to assert one's individuality in the face of social controls.

In that context, excessive surveillance can put too many restrictions on such behavior and lead to an unraveling of the collective psyche. The question becomes: why should even low-end anti-social behavior be tolerated?

One possibility is the quest for independence. We are, as a species, rather continuously tense and prone to both pessimistic and optimistic cognition. For a creature at once dependent on the security of the Hobbesian state yet absorbed by the quest for individuation, such acts separate us from the pack. Conformity is an act of compliance. As such it has no aggressive component and cannot resolve tension-induced conflict. On the other hand, marginal rebellion against rigid norms does precisely that.

One can extrapolate from that regarding the future of human psychological development. If such pockets of expression are necessary yet increasingly stifled by the proliferation of surveillance a potential sea change in the psyche could occur. Some of those changes might prove to be problematic.

As discussed above, the fear of being monitored constantly will skew the psychic balance toward superego dominance. That in turn will produce behavioral inhibition, social withdrawal and a decline in creative risk-taking. There will be a tendency to think more in terms of future consequences than in the present. In other words, remove impulse and you remove present-sense thought and action. In that context we could become slaves to social scrutiny, thus rendered incompetent at solving problems in the here-and-now. Parenting, education, artistic expression, financial exploits, relationships – virtually all human endeavors will accede to the mandates of a pathologically blown-up superego. But that is perhaps only the first step toward a more ominous outcome.

A Second Adaptation…

An analogy to Newton's third law of motion might be applicable, i.e. that for every action there is an equal and opposite reaction. In behavioral terms one might expect to see a massive rebound effect typified by the disinhibition of behaviors, emotions and attitudes. There is research to support that assumption (Keltner, Gruenfeld et al. 2003). It portends a heightened rate of sociopathic behavior whereby we could conceivably become morally oblivious to one another. Allegiance will shift toward technology (which we will come to trust) and away from people (whom we won't). Beyond that, new moral standards and restrictions on human behavior would be created that no one could possibly meet. That could lead to a broad intra-species disdain – a sense that we are all bad actors. Mass estrangement could result, which could exacerbate violent and/or amoral tendencies. However, it is possible the psychic transition would not end there.


A third possible step in the psychic sequence has to do with an information glut. Like money, the value of information will decline in proportion to its increasing volume. That is because, in Information Theory terms, more input creates more noise/uncertainty, and thus less pure, encodable information (McEliece 2002). The impact of info-volume could create a trend toward what sociologists have termed "anomie": a widespread cultural disengagement devoid of meaning, passion and vigilance. Under such conditions there will be an increasing lack of distinction between moral and immoral, beauty and ugliness, quality and sloppiness, attitudinal focus and general apathy. The new breed of techno-sapiens will be tethered to nothing but love for the technology itself. The dreaded, unexpected consequence of hyper-information (surveillance and otherwise), which purports to make us more aware and better citizens, will have a reverse effect. Instead we will become desensitized to input: unmoved, unable to determine relevance or gain a sense of historical, cultural and personal time and place. It is a final countdown to apathy, with far too many of us (previously) socially dependent creatures now plagued by emotional futility and depression: upright-walking beasts unable to make their way through the grasslands of history for lack of an experiential anchor point.
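The Shannon-theoretic point that rising noise shrinks the amount of encodable information can be made concrete with the textbook binary symmetric channel, whose capacity is C = 1 - H2(p), where H2 is the binary entropy function and p the noise (crossover) probability. This is a standard coding-theory result, offered here only as an illustration of the claim attributed to McEliece, not as part of his argument.

```python
from math import log2

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with noise probability p."""
    return 1.0 - h2(p)

# As noise grows, the channel can carry less and less pure information;
# at p = 0.5 the output is pure noise and capacity falls to zero.
for p in (0.0, 0.1, 0.25, 0.5):
    print(f"noise={p:.2f}  capacity={bsc_capacity(p):.3f} bits/symbol")
```

At zero noise every symbol carries a full bit; at maximum noise (p = 0.5) nothing encodable survives, which is the quantitative face of the "information glut" argument.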

Hopefully the possible events outlined here will not occur, but one can anticipate that some negative after-effects will result from hyper-information. Whether that means we will abandon humanism and self-determination in favor of a new and subservient technological idolatry is anyone's guess.


Freud, S. Beyond the Pleasure Principle. In On Metapsychology (1987)

Freud, S. Civilization and its Discontents. London, Penguin Books (2002 reprint)

Glover, N. (2014) Freud’s Theory of Art and Creativity: Essay in Psycho Media; Arts and Representations; Excerpt from Psychoanalytic Aesthetics: The British School.

Jung, C. (1951) Phenomenology of the Self. In: The Portable Jung, p. 147.

Keltner, D., Gruenfeld, D.H. & Anderson, C. (2003) Power, Approach and Inhibition. Psychological Review 110 (2) 265-284.

Laplanche, J., Pontalis, J-B. (1973) "Id." The Language of Psychoanalysis. London, Karnac Books

Matte, G. (2001) A Psychoanalytic Perspective on Humor. International Journal of Humor Research 14 (3) 223-241

Perls, F. (1957) Interview with Adelaide Bry in Gestalt Kritik, where Perls stated: "Resistance consists of an impulse and resistance to that impulse, considered like a dichotomy. Resistance is frequently referred to as 'bad,' and in the context of regulation grows into just personal dictates of the client and not the therapist. Taken as a polarity it is as integral to health as the trait being resisted."

Rossiter, L. (2013) The Liberal Mind. On-line article on liberal

West, M. (2000) Sexually transmitted disease are result of liberalism. Article in, March 15, 2000


Zoom Lens Cognition: A Psychological Adaptation

June 27th, 2016 by Robert DePaolo | Posted in Psychology

Zoom Lens Cognition:

A Therapeutic Adaptation


by Robert DePaolo


This article discusses an adaptive cognitive mechanism, applicable in therapeutic settings, whereby the client/patient learns to make experiential discriminations based on the alternating need for broad, encompassing decisions versus narrowly focused ones in adapting to psychological circumstances and maintaining emotional stability via an intact, functional ego. It involves a conceptually simple self-regulatory process and a modification of the classic definition of the ego.

The Ego: Defined and Simplified

The Freudian concept of the ego is complex, inasmuch as it involves the orchestration of experiences, emotions and actions across a broad range of circumstances and social norms (Henriques, 2013). In its essence, the ego is presumed to be a moderator between the instinctive component of mind, which worked in the wild and indeed enabled our species to survive, and the conscience-driven mind, which adheres to the broad moral and ethical concepts inherent in increasingly complex human societies.

The psychoanalytic definition of the ego is as a hierarchical mechanism that can override both excessive need gratification and excessive guilt in the pursuit of external and internal equanimity. The ego must see, hear and feel the totality of human experience, including the rewards for abiding by rules and the repercussions for not doing so. It involves a balance between self and others, between immediate and long-term moralities, and at times it must operate by proportion (e.g. is it wrong to steal food from a grocery store if doing so is the only way to feed one's family?).

Because it involves so many variables, the ego is hard to convert into a therapeutic teaching tool. Perhaps that is why neo-Freudian ego therapies have been so reductionist; for example, basic exercises in logic (Wenzel, Brown et. al. 2011). The latest revisions are seen in cognitive-behavior therapy, which uses consequential logic to counteract irrational thoughts and actions. Although this method has yielded positive results, it lacks a definitive structure and lends itself to various interpretations of what is "rational" and what is not. In some instances it also neglects to include existential aspects of the client's experience.

For example, it seems logical to suggest to a client that overeating leads to health problems, poor self-image and social rejection. Yet while eating, the obese client is actually solving a problem; for example, alleviating anxiety and prompting the release of endorphins that can overcome temporary bouts of depression.

Co-considerations of the Ego

It is possible to introduce a slightly different concept of the ego without diluting its importance, but to do so requires some discussion of the polarities involved in this aspect of mind. Ordinarily, newly trained clinicians are taught that the healthy ego facilitates patience and weighs all the ramifications and repercussions of experience before acting. It might best be captured in the phrase: all things come to him who waits.

The problem with that model is that waiting does not always equate with a positive outcome. In a competitive society, it is often the impulsive, aggressive and self-centered person who wins. Not only does he obtain material rewards but sexual opportunities and other benefits as well. In other words, it is conceivable that in some instances an ego-dominant personality might be maladaptive – or at least non-productive. That is in part because, while Freud assumed human society transcends the primal world, many aspects of the wild remain in play. Competition, aggression, jealousy, hoarding and tribal hostility have never really been erased from human experience. While we discourage these behaviors, we also know they have a place when it comes to survival.

With regard to impulse-driven success, it could be argued that money and success do not equate with mental health. Yet that argument can be refuted. For instance, it has been shown that a high correlation between behavior and positive feedback (outcomes) does have bearing on one's sense of hope and behavioral resilience on one hand, and on susceptibility to depression on the other (Zimmerman, 1990). Clearly even an impulsive connection between action and reward can be conducive to feelings of pleasure and equanimity.

A second consideration involves the relationship between the ego and superego; specifically, the question of whether, and to what degree, inhibition and guilt are good or bad for the person. To wit: is it possible that in some circumstances a higher morality can be self-destructive?

In either case the need for psychic balance is obvious. However, by its sheer complexity the id-ego-superego psychic system seems to require more than logic for a true, useful and functional understanding by both clients and therapists. For that reason it might be helpful to consider an alternative.

Lights, Camera, Stability

All of the above ego-related factors can be encompassed in the word perspective. This word has a broad meaning, but it is difficult to apply in the ever-changing life of a client because it has little cue value. It is possible, however, to use a concise cognitive cue and teaching device to help clients cope with duress and improve the chance of self-actualization.

Signal Shifting

The cue referred to here is derived from the simple mechanics of a video camera – the old-fashioned kind with a knob to zoom in and out depending on the need for varying visual perspectives. In some sense a zoom mechanism is consonant with how the human brain works (Lewkowicz, Ghazanfar 2009). We alternately narrow and broaden neuro-experiential searches depending on circumstances (Spector, Maurer 2009). In fact it has been argued that one function of emotion is to create a narrow focus, because when faced with fight-or-flight exigencies we must act quickly and focally (Platonic contemplation does not often work when one is chased up a tree by a leopard).

Much of human experience (and for that matter the history of clinical psychiatry) revolves around the mental dexterity involved in shifting between broad and narrow concerns. Knowing when to block out superego-driven psychic noise in adhering to focal goals, in order to problem-solve and instate a hope-inspiring response/reinforcement relationship, versus when to defer to longer-term considerations, is quintessentially important. There is nothing new in this assertion, as it permeates the writing of Jung via his polarities theme (Sandford, 1980), Adler, as per his social interest concept (Watts, 2003), and Erikson, with his stage-oriented description of perspective (Gross, 1987). As discussed earlier, even behaviorists have demonstrated the importance and benefit of language as a mediating mechanism that reinforces secondarily while also enabling the person to postpone reward attainment (Boersman, 1966).

The zoom lens cueing system is really a mechanism to prompt effective "cognitive switching." It teaches the client how to apportion immediacy and perspective, on the notion that both can be appropriate and necessary in sustaining mental health. Such a simple concept might help clients switch between these two mindsets via cognitive streamlining, keep them from being caught up in the muddle-headed vernacular of psychotherapy, and serve clients with varying levels of cognitive ability across a wide variety of circumstances.


All forms of counseling are arguably didactic. Even client-centered and psychoanalytic methods involve the teaching of logical thought patterns, self-expansion and other habits deemed necessary to maintaining mental health. However, not all therapies use a concrete cueing method to fortify the ego. In fact, at the end of most successful therapeutic forays, clients are hard put to describe the formulas that "worked" to get themselves on track. Even the rational therapies tend toward dialectic interactions whereby clients' irrational thought patterns are challenged in various contexts. No single cue or concept with the potential to govern adaptation across circumstances is typically taught. That might be why even many therapists cannot describe how or why their clients improved. In that sense a visual, concrete teaching model based on a "zoom lens" adaptive cue might be effective.

Application of a “zoom cue” might be exemplified in the following way…

Having discussed the nature and origins of the stress leading to the referral, as well as the ultimate desired outcome, the client and counselor can eventually gravitate toward discussions of whether "resolving your problem requires turning the zoom up or down." For under-assertive, anxiety-prone clients, the development of narrow, goal-oriented "close-up zooming" might be the didactic focus. For impulsive, aggressive clients with depleted ego functions, focusing on a distal view might be more therapeutic.

Ultimately all clients could benefit from being able to assess when narrow vs. broad zooming is most beneficial. It would entail an elasticizing of the ego to serve both self and society – as Freud ultimately intended.

Clinical Considerations

It might seem a bit simplistic to state that overly narrow and/or broad mindsets are at the core of psychopathology. Yet a glance at the bipolar nature of psychological disorders does lend support to that idea. Aside from the psychoses, which are by now viewed as having biological causation, a great many of the diagnostic categories in DSM-IV bear some relationship to either narrow or broad psychological thinking (Payne, Hirst, 1957). One type involves the narrow (id-dominant) style; for example, explosive disorders, borderline personality disorders, anti-social personality disorders and narcissistic personality disorders. In those instances, self-concern and the exclusion of broader considerations tend to lead not only to pathologies but to antisocial behavior patterns with negative outcomes such as drug addiction, incarceration, etc. Meanwhile, the anxiety disorders are often accompanied by overly broad, inhibitory patterns, whereby the person is virtually blocked in his attempt to meet needs due to fear of retribution, rejection and guilt. In such cases the person is not only psychically immobilized but also rendered incapable of being "narrowly selfish" enough to meet his needs, self-actualize and develop an enduring sense of hope and emotional resilience.

One of the most significant byproducts of over-inclusive (distal-zoom) cognition is depression. The inhibition and extreme restraint that result from overly broad thinking carry with them a number of secondary effects. In not being able to meet needs and self-actualize, the person with an excessively distal zoom perspective can end up with a sense of futility, hopelessness and self-deprecation. In psychobiological terms this can exacerbate feelings of depression because, just as an adequate relationship between behavior and positive feedback leads to the production of catecholamines, the pleasure-sensitive neurotransmitters (Guerra, Silva 2010), so too could an absence of behavioral success create an opposite effect. In that sense there is an obvious connection among hope, chemistry and the "cognitive zoom."

Cognitive therapies address this issue in broad terms; mostly with regard to the client's specific experiences on a moment-by-moment basis. And there are templates used in some forms of this method, as for example the phrasing employed by Ellis in Rational-Emotive Therapy (Ellis, Dryden, 2007). However, the use of a zoom lens cue as a self-regulatory guide offers an economical, heuristic mechanism with which to elasticize the ego and insulate clients against self-destructive thought, emotional and behavior patterns and various pathologies.

Learning and Simplicity

A relevant question to ask is whether a simple cognitive cue is sufficient. One possible answer lies in the nature of learning itself. Educators have historically (and effectively) used concise, poly-applicable symbols and cues to enhance both learning and memory: for example the alphabet song, the spelling rule “i before e except after c”, and various math formulas such as the Pythagorean Theorem. Being able to whittle information down into a single rule or concept does improve learning, and to the extent that counseling involves learning, such a mechanism might help improve clients’ mental resilience.

The Counseling Process

As with all therapeutic interventions, the use of a “formula” can never replace the necessary preliminary features of counseling such as empathy, relationship building, gathering of background information etc. However at some point a zoom cue model could be effective, with the counselor perhaps asking questions like the following…

1. Do you believe it is sometimes okay to look out for your own interests – even if that might seem selfish?

2. Do you believe that it is sometimes appropriate to delay pleasures because a greater good can come from patience and perspective?

3. If both of those statements are true does it not follow that life involves a proportion between the concerns of self and broader concerns, between immediate goal seeking and delaying immediate pleasures in favor of greater rewards down the road?

4. Do you think therefore that mental health means figuring out how and when to employ one or the other?

5. If so, we now have our “golden mean” – our ideal proportion. So let’s now review what’s happening in your life and see if we can figure out when to narrow the zoom or widen it, as well as the reasons for those decisions.

It could be argued that the above model runs the risk of oversimplifying life; especially since human experience entails so many gray areas. Yet psychotherapy need not address every need or set of circumstances. It is not a religion. What it can do is provide a learning tool useful in dealing with occasional duress, ameliorate the oftentimes excruciating juggling act between self-actualization and social responsibility and aid in the lifelong quest for psychological equanimity and social adaptation.


Boersman, F.J. (1966) Effects of delay of information feedback and length of post-feedback interval on linear programmed learning. Journal of Educational Psychology, 57 (3), 140-145.

Ellis, A., Dryden, W. (2007) The Practice of Rational Emotive Behavior Therapy (2nd Edition). Norton.

Gross, F.L. (1987) Introducing Erik Erikson: An Invitation to his Thinking. Lanham, MD: University Press of America.

Guerra, L.G.G.C., Silva, M.T.A. (2010) Learning process and the neural analysis of conditioning. Psychology and Neuroscience, 3 (2), 195-208.

Henriques, G. (2013) Theory of Knowledge: The elements of ego functioning. Internet article posted June 27, 2013.

Lewkowicz, D.J., Ghazanfar, A.A. (2009) The emergence of multisensory systems through perceptual narrowing. Trends in Cognitive Sciences, 13 (1), 470-478.

Payne, R.W., Hirst, H. (1957) Over-inclusive thinking in a depressive and a control group. Journal of Counseling Psychology, 21 (2), 185-188.

Sandford, J.A. (1980) The Invisible Partners: How the Male and Female in Each of us Affects our Relationships. Paulist Press.

Spector, F., Maurer, D. (2009) Synesthesia: A new approach to understanding the development of perception. Developmental Psychology, 45 (1), 175-189.

Watts, R.E. (2003) Adlerian Cognition and Constructionist Therapies: An Integrative Dialogue. New York: Springer.

Wentzel, A., Brown, G.B., Karlin, B. (2010) Quote from the Therapeutic Manual for Cognitive-Behavior Therapy: “Cognitive Therapy (CBT) is a structured, time limited, present-focused approach to psychotherapy that helps patients develop strategies to modify dysfunctional thinking patterns of cognition.”

Zimmerman, M. (1990) Toward a theory of learned hopefulness: a structured model analysis of participation and empowerment. Journal of Research in Personality, 24, 71-86.


Psychiatry and the Evolution of Human Language

May 2nd, 2016 by Robert DePaolo | Posted in Psychology

by Robert DePaolo


This article discusses the therapeutic effect of language in an evolutionary context. The point is made that while the evolution of human language served a number of social/interactive, analytical and action-planning purposes, a parallel benefit accrued as well: an enhanced capacity for self-control as insulation against environmental dangers and setbacks, and an ability to use internal deception to provide psychic endurance past the point of apparent helplessness.


The origin of human language is a very complicated topic. It is one reason why some insist that man in some ways transcends the rest of the animal kingdom (Ptolemy 2009). Indeed a creationist might refer to human language development as proof of a higher power or at the very least a super-organizational entity.

A glance at the language development of a young child provides grist for the mill. The infant begins with a vast number of loosely connected neurons in its brain, which are barely able to coordinate basic motor skills for the first several years. In that same time frame the child acquires, at an exponential rate, a capacity to think and speak in terms of past, present and future.

This is particularly interesting because the child’s brain is developing on several fronts during the first few years. Perceptual circuits, motor circuits, emotional (midbrain) wiring and the arrangement of vertical and horizontal pathways in the cerebral cortex are also developing, and must do so in order for cognition (which depends on such experiential content) to expand. Yet while the child does not go from taking his first steps to running a 9.5-second hundred-meter dash in three years, his language skills broaden at an astonishingly rapid pace. His brain development is skewed in the direction of language acquisition. One can speculate about how this ties in to human evolution.

The Anatomy of Pleasure

Some aspects of human language evolution were predictable due to the emergence of opportunistic human anatomy. For example, the lowering of the hyoid bone in the throat enabled our ancestors to extend vowel sounds. This would have enhanced their phonetic variety and provided several advantages. One would have been esthetic.

Due to enhanced breath control, the capacity to sustain and vary vowel sounds could have produced an auditory end product (including song) attractive not only to members of the opposite sex but to all members of the social group. The reason for this assertion derives from a common theory of esthetics, stated at various times by St. Augustine and Descartes, and in more recent times by Berlyne (1960, 1975) and Zeki (2009). The theory states that any stimulus that includes both familiarity and novelty will be perceived as pleasurable, and that the pleasure could take any form or serve any purpose, including music, inspiration, seduction, even humor.

The underlying reason has to do with the two aspects of the pleasure response – search and closure. Pleasure is not a free experience. It requires some degree of perceptual work (Freud, 1928). One must earn it by converting irresolution into resolution. Faced with a totally unfamiliar stimulus complex the perceiver will tend to withdraw from it because he cannot “wrap his brain” around the input. By the same token, encountering a completely familiar stimulus complex will also lead to avoidance because no perceptual work is required. Any experience that precludes a successful search cannot result in resolution, i.e. closure. In that context, the ideal “pleasure proportion,” featuring a workable, resolvable combination of sameness and novelty, would have accompanied human expression from the outset.
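The search-and-closure account implies an inverted-U relationship in the spirit of Berlyne’s arousal theory: little pleasure at total familiarity, little at total novelty, a peak somewhere between. The toy function below is purely illustrative; the quadratic form and its scaling are assumptions for the sketch, not anything specified by the sources cited here.

```python
# Toy inverted-U model of the "pleasure proportion."
# novelty: 0.0 = completely familiar, 1.0 = completely unfamiliar.
# The quadratic shape is an illustrative assumption, not an empirical curve.
def pleasure(novelty: float) -> float:
    """Zero at either extreme; maximal at an intermediate mix."""
    return novelty * (1.0 - novelty)

# Sample the curve: a half-familiar stimulus out-scores the extremes.
samples = {n: pleasure(n) for n in (0.0, 0.25, 0.5, 0.75, 1.0)}
best_mix = max(samples, key=samples.get)
```

On this sketch the best mix lands at intermediate novelty, while the fully familiar and fully novel stimuli both score zero, matching the avoidance the text predicts for each extreme.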

The First Crooners

Many believe descent of the hyoid bone first appeared with Neanderthal, though it seems rather unlikely that, even if Homo erectus’ hyoid position precluded extended breath control and vowelization, his cognitive abilities could have developed without some sort of linguistic impetus (bearing in mind that it is also possible to speak without vowels, as exemplified by the click language of the !Kung people in Africa).

Whether or not hyoid descent was confined to Neanderthal and Sapiens is not certain, but considering the Darwinian concept of conversion, whereby a trait initially favoring one skill is eventually used for others, it seems likely that early versions of both species quickly learned to profit from this newfound oral capability. It is also likely that this capacity was refined over time alongside its physiological correlate, upright walking (bipedalism allows for such changes in bodily structures, including the spinal cord, digestive tract and voice box). The trait was obviously passed on to our own species.

The Expansive Song List

As his phonetic breadth increased, early Homo sapiens’ expressive skills became more finely orchestrated. That would have enabled him to learn search-and-closure expressive sound patterns that were attractive to listeners. Considering man’s primate-consonant penchant for mimicry, this pleasurable communicative trick would have caught on. Early man would have created a new, dual aspect of experience combining pleasure with object, action and descriptive labels that could be applied to experiences and people.

At that point the stage was set for a new version of “alpha” who gained his or her influence not by the typical primate criteria of physical prowess and dominance but by esthetic expression. Over time, as human society expanded and roles proliferated, the expressive-alpha tactic filtered down through the ranks. Musicians, prophets, healers, ministers, writers, poets, minstrels, actors, politicians, mind healers and eventually psychotherapists were included in the neo-paradigm, each able to establish themselves in the hierarchy by word as well as deed.

In terms of that scenario one could argue that the benefits derived from being an “esthetic communicator” would have been enough to set human speech on its developmental path. Unfortunately, the analytic complexity of human language makes that doubtful. Running parallel to the esthetic factor is the element of meaning. In order for the listener to respond to the esthetic value of language he must understand the concepts and vocabulary of the statements. In terms of our own sociobiological evolution, that required an additional capacity within the brain: to gauge the level of intelligence and overall cognitive demeanor of the listener, to process his facial expressions, body posture and movement trends in distinguishing approval/comprehension from confusion/frustration, and to adjust the message accordingly.

The Evolution of Empathy

With regard to how, in the course of evolution, Homo sapiens developed a capacity to read the reactions of others, one possibility comes to mind. The brain of a primate is large relative to body size, but while that allows for a variety of perceptual, motor and emotionally induced behavior patterns, the most basic function of the primate brain might lie in a capacity to imitate.

Contrary to the oft-stated notion that the “brain is too complex for humans to understand” (Doglas 2006), one can reasonably view the brain as a kind of copy machine. If not for that foundation, long-term memory would be a pipe dream and our emotional depth would be restricted to momentary fight/flight experiences. When inputs impinge on the brain, its neuronal patterns appear to replicate the energy-signature patterns of the outside source in isomorphic fashion (Lehar, 1999). For example, the visual energy signals of a tiger are replicated as if the color and dimension of the tiger were housed in the perceiver’s brain. Over time various associations can be developed around that neural process, but in the first instance the brain copies what it sees, hears and feels.

Since imitation is conceivably neuro-primal, one can assume that one person interacting with another is in some sense incorporating that person into a neural scheme via a representation process, e.g. “The other person, place or thing becomes me.” This notion is supported by studies showing a tendency for people to mimic the patterns of others unconsciously. For example, the menstrual cycles of females living together tend toward temporal synchrony, as per the so-called McClintock effect (Arden, Dye 1999). Also, the emotional arousal level of one person will tend to rise and fall in sync with that of the person with whom he is interacting (Konvalinka, Xygalatas et al. 2011). This is one reason why social interaction can be so stressful to therapists, or for that matter anyone involved in a commiserative exchange.

However another question arises: specifically, what can be inferred about how internal mental processes such as imitation, empathy and communicative esthetics played out in evolutionary terms?

The Ice Age

The climate was in an extreme cooling phase during the first human occupation of Europe and what is now called the Middle East. Living on frozen tundra had enormous ramifications for survival. In order to understand this, one must look beyond traditional paleo-anthropological concepts toward ones more psycho-anthropological. This is not to minimize the challenges of the physical environment. For example, with a scarcity of plants the Neanderthals and Cro-Magnons had to rely more on a diet of meat. That altered their behavior patterns in a quite natural way. Also, most of the game animals at the time were massive, and with only Acheulean axes and perhaps spears, early man had to refine his hunting and planning skills.

There was a problem inherent in that. Early man had to work extremely hard and long to carry on these and other arduous, lengthy tasks, yet cold weather has a dampening effect on hormones, activation level, metabolism, mood and motivation, especially if a species migrates from the African savanna to western Europe without sufficient time for physiological adaptation (Leppauoto, Hassi 1991). In modern times we recognize that seasonal shifts from fall to winter are often accompanied by psychopathologies. While naturalists typically distinguish mammals from reptiles based on our constant body temperature and their lack thereof, mammals also experience downward physiological shifts in cold weather. Indeed, the fact that human metabolism slows down considerably in winter and speeds up in spring is one reason why suicides tend to be more frequent in the crossover between the two seasons. The transition from a dormant season to an active season can be physiologically as well as emotionally overwhelming for depression-prone individuals.

Carrying out the strenuous activities needed for survival in a cold climate (bearing in mind that there was no techno-culture to offer support) required a reliable, internally driven mechanism by which to summon and sustain activity levels when nature created an opposite effect. In simple terms it required the development and implementation of an energy-regulating mechanism – the psyche.

The emergence (one hesitates to say “evolution”) of such an ergonomic aspect of mind was probably less than completely adaptive because, as both Freud and Maxwell suggested, energy is a neutral phenomenon that can create or destroy. The new aspect of mind was more complex and vagarious than, say, upright walking or tool use, and had to be channeled properly to advance the cause of survival.

An Adaptive Paradox

It is generally assumed that Neanderthal had a sense of his own mortality, as per his habit of burying the dead (Than 2013). That speaks to his psychological sophistication. Yet in some ways mourning is an irrational behavior. The deceased is no longer functional in the group, and his absence has no subsequent influence on survival and adaptation. Moreover, since man was and is quite familiar with his own mortality, there is no logical reason to feel emotionally “robbed” by the death of a family member or colleague. Yet death is a devastating experience, largely because we internalize it beyond mere logic. “How can I bring him back – through memory and legend?”… “What if I’m next to go?”…

Such contemplative, emotional largesse, combined with a cold climate, could have led to feelings of helplessness, futility and depression – a double whammy – if not for an adaptive conversion that modern clinicians will undoubtedly recognize.

The Self and the Artist

The adaptive mechanisms enabling early man to overcome these obstacles were likely in the language domain; not just communication but three other forms that create a capacity for self-deception: art, self-talk and self-regulation. With respect to art, having an internal language capacity would have led to an ability to imitate and represent the outside world. In his cave paintings early man could depict the various animals around him and create symbols of people within his social group. He might even come up with the idea of an alpha even higher in rank than those in his tribe, who could not only rescue him from his obscurity within the group but orchestrate all of nature – a god.

The artistic depictions at Lascaux and Altamira might have recaptured life, but perhaps only due to a prior skill set. In order to find his inspiration the artist had to preplan as to subject, background, color scheme etc. In so doing he would have begun with a self-talk exercise (not necessarily overt but perhaps covert) whereby his expanding prefrontal lobes drew associations from Broca’s area in the frontal lobe, so that the vast inhibitory circuits of the former could whittle language down into an inhibited, fractional version of speech resulting in inner contemplation.

Through that guiding process size, angulation and action depictions could be altered. All aspects of life could be restructured so that victories could be imagined, dangers avoided, animals spirits created from scratch, joyful outcomes conjured up in the absence of direct experience or concrete possibilities.

The benefits of this new, internal capacity would have proliferated as a result of social influence and mimicry. It would have facilitated artistic expression and, through its capacity to re-shape experience, also enabled our ancestors to summon and sustain arousal, motivation and behavioral activity levels in a sparse and hostile terrain.

Up to that point cruel nature had decided which creatures could carry on and which would fade into oblivion. Then man came along and was able to outwork and to an extent circumvent its whim.

From the Cave to the Couch

Any psychotherapist can see the connection between the above elements and the present-day use of defense mechanisms in providing emotional equanimity during difficult times. Homo sapiens appears to have embarked on a somewhat ironic evolutionary path. In taking a ramp off the road of natural selection we developed a secondary, alternative, internal world in which we can distort experience at will in accord with our needs and frailties. Darwin’s theory of natural selection holds that organisms survive through random mutations that over time are selected or rejected based on their juxtaposition with the outside world. That rule was overturned to an extent with the emergence of Homo sapiens. Indeed one could argue that we have actually been able to separate ourselves from the outside world, and that human evolution offers a partial exception to the Darwinian rule. Does this mean we are transcendent, or, as Darwin and Huxley surmised, merely a biological after-effect arising from the initiative of mindless nature? Such a question is highly philosophical, maybe even moot. Yet it is difficult to refute the notion that the rise of mankind was marked by an ongoing competition between nature and the imagination.


Arden, M.A., Dye, L., Walker, A. (1999) Menstrual Synchrony: Awareness and Subjective Experience. Journal of Reproductive and Infant Psychology, 17 (3), 255-265.

Berlyne, D.E. (1960) Conflict, Arousal and Curiosity. McGraw Hill.

Berlyne, D.E. (1975) Aesthetics and Psychobiology. National Arts Educational Association.

Doglas, Y. (2006) The Paradox of the Brain. Neuroscience, Sept. 15, 2006.

Freud, S. (1928) Humor. International Journal of Psychoanalysis, 91-96.

Konvalinka, I., Xygalatas, D., Bulbulia, J., Schjødt, U., Jegindø, E-M., Wallot, S., Van Orden, G., Roepstorff, A. (2011) Synchronized Arousal Between Performers and Related Spectators in a Fire-Walking Ritual. Proceedings of the National Academy of Sciences, May 17, 2011, 108 (20), 8514-8519.

Lehar, S. (1999) Gestalt Isomorphism and the Primacy of the Subjective Perceptual Experience. Behavioral and Brain Sciences.

Leppauoto, J., Hassi, J. (1991) Human Physiological Adaptation to the Arctic Climate. Arctic, 44 (2), 130-145.

Ptolemy, D. (2009) Notes on Man’s Transcendence. Derived from the film Transcendent Man, Ptolemaic Productions/Therapy Studios.

Than, K. (2013) “Ancient Ritual”: A 50,000-Year-Old Neanderthal Discovered in a Cave in France was Intentionally Buried. National Geographic, Dec. 16, 2013.

Zeki, S. (2009) Statement on Neuroesthetics. Neuroesthetics, Web, Nov. 2009.


Mass Murder: The Psychology of Prediction and Prevention

February 14th, 2016 by Robert DePaolo | Posted in Psychology


by Robert DePaolo


This article discusses the dynamics of human aggression, specifically the proclivity toward killing strangers via the mass-murder scenario. The point is made that a package of traits involving perceived estrangement, pseudo-familiarity toward the victims, a nihilistic outlook (depression), obsession and ego-dysfunction combines to increase the probability of such an act.

Genes and the Ego…

If one asks a liberal Democrat why people engage in mass murder, he or she will typically say it has to do with the availability of guns in American society. If one asks a conservative the same question, he or she will likely say it has to do with mental illness, and that the mere presence of guns does not correlate with the act of committing mass murder any more than watching violent movies automatically foments violent behavior. If one were to conduct a statistical correlation study of the number and/or percentage of gun owners vs. the number and percentage of people who use guns to commit mass murder, the correlation would be so low as to be statistically insignificant (Peterson, 2014; Kennedy, Skeem et al. 2012). The same would be true with regard to people diagnosed with mental illness, including psychotic disorders. In fact it takes a fairly organized mind to plan such an act, as well as an intact enough reality orientation to select targets, obtain weapons, and plan time, place etc.
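The statistical point can be made concrete with a Pearson correlation computed on invented numbers. The figures below are hypothetical, chosen only to illustrate the shape of the argument; they are not the data behind the citations. When ownership rates vary widely while incidence stays essentially flat, the coefficient lands near zero:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical regions: gun-ownership rate (%) vs. mass-murder incidents
# per million residents. Ownership climbs steadily; incidence does not.
ownership = [10, 20, 30, 40, 50]
incidents = [2, 1, 2, 1, 2]
r = pearson_r(ownership, incidents)  # near zero on this toy data
```

A coefficient this close to zero on such data is the quantitative form of the claim that ownership by itself does not track the act.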

Of course guns are used to kill, and some mass murderers have been presumed to have mental disorders. Yet developing a predictive model by which to identify potential mass murderers would seem to require more than a liberal or conservative argument, and perhaps to encompass more than gun ownership or severe mental illness. That raises a question. To wit: if neither mental illness nor gun ownership is a necessary and sufficient antecedent, where does one look to find a preventative model?

Genes and Detachment…

A number of biologists have surmised that embedded in virtually all organisms is a disposition to behave altruistically toward those within the same gene pool, and/or those with whom one has familiarity on a regular basis (Thompson, Hurd 2013; Wendell, 2013) – especially if the relationship involves cooperative, mutually beneficial interactions. On the other hand it seems that as the genetic and social ties drift off into dissimilarity and non-mutuality, the potential for aggression increases. It isn’t necessarily that our genes foment hate based on the differences between organism A and organism B; it’s just that with differences (real or perceived) the protective instinct dissipates. With respect to the psychology of Homo sapiens it is left to the individual to fill in the gaps. Various outcomes can arise from that sense of estrangement.

As the most social of the primates we are extremely dependent on social interaction. As exemplified by a survey conducted by the Self Help Collective (2014), various studies on public speaking and other potentially performance-related experiences suggest humans fear ostracism more than death. When children are neglected or ignored in early development they become not only un-socialized but resentful and aggressive.

In a sense the necessary ties we have with one another comprise our greatest strength; giving us hospitals, caring parents, mentors, supportive agencies, even clerics willing to hear about our sins, trials and tribulations and offer solace. However when that “social compact” (i.e. the expectation of nurturance and reciprocity) is violated it is often perceived as betrayal. That can lead to retaliation.

If detachment-driven revenge were all there was to it, prediction and prevention would be fairly easy and, rather ominously, there would be more potential mass murderers. Obviously there is more to it than that. In fact another, paradoxical, dual psychic component seems prerequisite. One part consists of a self-perceived closeness to the target group or individual, because it is difficult to hate, let alone kill, people who are inconsequential in one’s life. In order to develop the essential mindset enabling one to pass beyond the regulatory barriers of ego and superego, the avenger must categorize his victims to a point of utter familiarity. A second part (comprising the paradox) involves the perception that the targets are alien and distressingly unrecognizable.

It is a paradigm adopted by Adolf Hitler, who first espoused clear separation between the “Aryan race” and virtually all others, then assigned distinctive traits to his targets (Jews and others), turning them into familiar entities. Recognition and separation: these elements can set the stage for ostensibly any anti-social act, including mass murder.

Such a dynamic can be seen in what might be called estrangement murders as well. Prime examples are Mark David Chapman (the murderer of John Lennon) and Ted Bundy, who targeted women who resembled a woman who had rejected him prior to his rampage yet who were in fact complete strangers. It has also typified the onset of wars throughout human history, whereby specific traits and distinctions were assigned to enemies – by race, religion, ethnicity – thus releasing the demons within our species to freely kill and maim.

The Freedom of Psychic Rigidity…

Thus far, the elements of estrangement and quasi-familiarity have been viewed as requisite psychological components. In fact there are other variables to consider. One is considered here to be a kind of “theory of everything” with respect to anti-social behavior: the obsession. In order to discuss this it might be best to begin with Freud’s concept of the ego. This notion is a bit general but can be said to correspond to the self-awareness, linguistic, planning and regulatory functions of the prefrontal cortex. Functionally the ego is designed to prevent extremes. It is a psychic clearing house, an arbiter of reason, invariably weighing factors such as risk, reward, self-image congruence, social ramifications and emotional repercussions in the process of behavioral selection (Strachey, 1990). If it can be bypassed, all those modulating capabilities can go by the boards, making each of us capable of extreme behavior patterns. With a depleted ego man can indeed potentially devolve into a “killer ape.”

The intact ego is pan-influential, so most of us operate via multiple faculties, weaving cognition, emotion, perception, memory and behavior into socially acceptable patterns. That’s why most people with guns do not murder and why most viewers of violent movies do not go out and commit violent crimes. The ego filter is a greater deterrent to sociopathy than gun laws or artistic restrictions. It provides a pro-social tether that can rein in aggression so we can think beyond the moment. On the other hand the tether can be severed by obsessive-compulsive features.

Overriding the ego’s considerable breadth and influence requires extraordinary force. An emotion-induced obsession suits that purpose very well. But it must be more than an episodic occurrence produced, for example, by situational or “state” anxiety. It must be a fairly ingrained trait, created by innate temperament, doctrinaire conversion, trauma or neurologically induced rigidity – the latter of which can produce hyperactive, perseverative tendencies in the brain, forcing it to operate on a single track to the exclusion of peripheral inputs and sensations.

Language and Tunnel Vision…

As Simon (1967) has demonstrated, emotional states tend to create a narrow focus. This trait is adaptive in nature but often maladaptive in society. Being pursued by a predator certainly requires a singular focus on escape. On the other hand, many social situations require subtlety even in emotional circumstances. Thus the human animal must be able to modulate what is essentially an adaptive mandate. The only way for this to occur is through the self-regulatory capacities of language. The ego is ultimately a language medium. Beyond expressive skills, two other linguistic skills are needed to rein in destructive emotional behavior. One is a capability (honed through practice) to talk oneself through duress. The other is a set of experiences whereby, through interactions with others, a sense of consensual reality is developed, so that the individual can “think as others think” and thereby preclude extreme thoughts and feelings.

All of those skills are ego-driven as Freud suggested, thus language habits (particularly a capacity for self-reflection and regular opportunities to “check in” with others in conversation) are a critical component in deterring the urge to kill.


Seligman (1975) wrote about depression as being typically accompanied by a sense of helplessness, i.e. the perception that one’s behavior cannot resolve disparities between needs and frustrations. While depression might seem a harmless emotional state, at least in its impact on others, it is in fact quite dangerous. Running out of solutions leads to nihilism, i.e. a belief that nothing matters. That leads to a “nothing to lose” mindset, which can not only remove the fear of retribution but make it seem somehow attractive.


Thus far several variables have been discussed with respect to the psychological makeup of a mass murderer. One obvious feature – aggression – has been left out. One might assume that engaging in mass murder requires an aggressive outlook. There is one problem with that, however. People get angry all the time yet do not act out in such extreme ways. Moreover, mass murderers often do not seem to be responding to grudges or momentary pique. Nor do they necessarily have a history of assaultive behavior (Hsu 2012). Indeed the act can sometimes appear random, with the perpetrator devoid of imminent rancor toward those around him prior to the act. Thus mass murder might not be purely an act of rage. As strange as it seems, the act might in its essence be more cognitive than emotional. The question is: what cognitive process would it involve?

Totem Pole Demons…

Recently, President Obama (and many others) expressed frustration over the perception that mass murder happens much more often in American society than in all other advanced societies – thus his call for stricter gun laws. He is probably wrong about that, since many advanced nations have experienced mass murder, including Russia, England and Germany, going even further back than Jack the Ripper. In that context it makes sense to look beyond a specific nation for the solution – indeed to the very nature of our species. Homo sapiens is a primate, or at least shares many neurobehavioral traits with other primates. One such trait is a large brain, which in turn disposes all primates to be highly social. The reason is fairly obvious. Big brains enable their owners to perceive faces, behaviors, physical characteristics, vocalizations and visual inputs in more detail. As a result primates are better able to distinguish among members of the group; for example to determine which ones are dominant, submissive, sexually receptive, emotionally unsteady, intelligent, wily, deceptive and so on. As Pinker (1997) has noted, some of the impetus for expansion of the human brain was to enhance social perception to meet the demands of increasingly complex human interaction.

One of the most significant byproducts of this process in humans is something that can be referred to as the “equi-hierarchy.” In chimp and gorilla social groups there is little question of rank. The silverback runs the show in gorilla groups and the alpha male rules in chimp society. Most of this is based on size and strength. Yet the chimps in particular experience political revolutions from time to time as lower-ranked males stage a coup against the alpha male. That is in part because they can see beyond size and strength. They see numbers as an advantage. They can also consider the age and physical decline of the alpha and other signs that he is “losing it” and plot against him in accord with those perceptions.

Human perception enables us to see an even greater number of variables. Size, strength, intelligence, artistic ability, social notoriety, sexual prowess, financial status and so many other features are symbols carved out on the totem pole of human experience. The net result is that the human hierarchy is so fluid as to be virtually in the eye of the beholder. One might even argue that beyond historical stimuli such as the Magna Carta, Locke’s treatise on the social contract or the Declaration of Independence, it was basic human neuro-psychology that created an inevitable drift toward democracy, particularly after the advent of the handgun, which truly leveled the playing field.

The equi-hierarchical phenomenon has enormous impact on human thought, which is reflected in the obsessive human need for attention, approval, fame etc. It disposes us to feel entitled to totem pole ascendancy; not just as a possibility but as a basic right. One byproduct of a doctrine of equality (not just before the law – which was the sole intent of the constitutional framers – but in all aspects of life) is potential conflict, both within the individual and in his interactions with society. This dynamic can provide justification for a mass murderer. The attitude “if society deems me equal to all others yet I fail to ascend the ranks, it is the fault of society and I am no longer obliged to live by its rules” can prevail, making mass murder less an emotional act than an intellectual assessment.

In that context, President Obama might be more accurate to say such crimes are more prevalent in societies with a low emphasis on acceptance, with highly fluid versions of equality and in societies where (perhaps through media-fueled emphasis on universal fame and notoriety) citizens are unable to find satisfaction in life’s smaller conquests and achievements. By that reasoning a philosophical society, or one with religious overtones – for example via the teaching of acceptance and a capacity to defer to a higher power – might well produce fewer violent individuals than a non-philosophical or godless society.

Diagnostics and Prevention…

The ultimate question is whether the above discussion can be applied to prevention. While no system is even close to perfect in predicting human behavior there are criteria that could increase the likelihood of detection and prevention of mass murder. For example an individual might be disposed to mass violence when…

1. There has been a long period of social isolation wherein ego functions could have been depleted by lack of normalizing social interaction and where an internal language capacity honed by faith or moral teaching is absent or diminished.

2. The individual expresses a sense of threat and betrayal due to a perceived, threatening disparity between his aspirations, self-perceptions and idealized status on the one hand and his actual life outcomes on the other.

3. When parents and family dynamics emphasize the “specialness” of the individual despite his or her average traits and accomplishments – which would set the stage for threat-inducing cognitive dissonance.

4. When cognitive and language habits reflect an ultra-familiarity with the traits, habits and motives of potential targets whom the individual does not actually know.

5. The presence of mental rigidity, in the form of repeated complaints, stock phrases in the characterization of others, reliance on rituals and habits, extreme emotional resistance to change, and even a rigidly fixed gaze, i.e. a lack of the shifting eye movements that accompany deliberation and the weighing of ideas, and that reflect a pause capability between feeling and acting indicative of working ego functions.

6. The presence of depressive traits, with accompanying feelings of helplessness and a nothing-to-lose mentality. That particular trait removes to an extent the fear of getting caught and punished and skews the risk-reward ratio toward anti-social behavior.

Each of the above factors could be seen in the behaviors of various mass murderers and serial killers. Certain factors might be more pronounced than others; for example some might have ample social contact but excessive rigidity and dissonance between self-image and actual status, as well as a closeness-detachment fixation toward others. Just how these factors could be used in a predictive scenario, whether by clinicians, family members, police or the courts, is hard to say. It could be a question of proportion, whereby some of these factors might be non-existent while others might be pronounced.

Hopefully this at least provides a beginning model for identifying individuals at risk, one that can be used even in non-clinical settings as a means of reducing, if not eliminating, the occurrence of such acts.


References…

Freud, S. (1990) The Ego and the Id. The Standard Edition of the Complete Works of Sigmund Freud. J. Strachey (Ed.) W. W. Norton.

Hsu, J. (2012) Can Mental Screening Predict Mass Murder? Tech News Daily, Dec. 14, 2012.

Kennedy, P., Skeem, J., Bray, B., Zvonkovic, A. Law and Human Behavior.

Peterson, J. How Often and How Consistently Do Symptoms Directly Precede Criminal Behavior Among Offenders with Mental Illness? American Psychological Association Newsletter 21, 2004.

Pinker, S. (1997) How the Mind Works. W.W. Norton

Self Help Reference: the fear-of-ostracism discussion was derived from a list compiled by the Self Help Collective in 2014.

Seligman, M.E.P. (1975) Helplessness: On Depression, Development and Death. San Francisco, W.H. Freeman

Simon, H. (1967) Motivational and Emotional Controls of Cognition. Psychological Review, 74 (1), 29-39.

Thompson, G.L., Hurd, P.L., Crespi, B.J. (2013) Genes Underlying Altruism. Biology Letters.

Wendell, J. (2013) Epigenetics Sheds New Light on Altruism. Genetic Literacy Project.
