Institutional racism

One of the most challenging days of my life was the day I spent in a roomful of lawyers, in Germany. An Army 1st Lieutenant and race relations education officer, just back from six weeks of training at the Defense Race Relations Institute (DRRI), I was assigned to conduct a one-day race relations seminar at the U.S. Army headquarters in Heidelberg. The attendees were the staff of the Judge Advocate General – all of the Army lawyers in Germany, including the one-star Judge Advocate General, himself. Because lawyers join the Army at the rank of Captain, I was the lowest-ranking officer in the room. And I was the only non-lawyer.

I was used to encountering resistance to race relations education, and I knew that leading this seminar wouldn’t be easy. Sure enough, during the morning session, many of the things I said about personal racism were challenged, and I felt like I was being cross-examined. I wondered if my presentation was getting through to anyone. Then, gradually, some of the lawyers present nodded their agreement as I made controversial points, and seemed to be coming around.

When I talked about institutional racism in the afternoon session, I continued to encounter resistance from some of the lawyers. But others began to side with me, saying things along the lines of, “Actually, Tom, he’s right about that” and “Let him finish making his point.” At the end of the day, several attendees thanked me and shook my hand. A week or so later, I got a letter of commendation from the general, stating that it was clear why I’d been chosen “to be an instructor in the difficult subject area of racism.”

The only way that I was able to hold my own in a roomful of lawyers was that the evidence was on my side. I had the facts; the lawyers who argued with me only had opinions. Still today, many white Americans remain blind to institutional/systemic racism and white privilege. They have opinions about the disparities between the white majority and people of color, but they don’t know the facts about institutional racism.

Many of the facts I learned at the DRRI came from the 1968 Kerner Commission report, which analyzed the societal factors behind the urban riots of the 1960s, including the 1967 uprisings in Newark and Detroit. The report disclosed inequities in employment, housing, social services and education, and identified discriminatory practices in policing and in the criminal justice system. The report concluded, “Our nation is moving toward two societies, one black, one white – separate and unequal.” Sad to say, not much has changed since the release of the report.

Institutional racism is a web of persistent, interrelated inequities having to do with housing, hiring practices, education, nutrition, health care, and law enforcement. People who are blind to institutional racism tend to believe that the disparities in wealth and social status are attributable to factors like intelligence and ambition. But, in fact, there is still widespread societal discrimination against people of color. The playing field is still not level.

The “white flight” to the suburbs left many inner cities mostly populated by people of color. Since most school districts are funded by local property taxes, and property in run-down inner cities and pockets of rural poverty is generally less valuable than in white communities, many minority group children get an inferior education, limiting their job prospects. Access to affordable health care, including preventive care, is often limited in minority communities. Many of these communities are also “food deserts,” with no supermarkets to provide fresh produce and nutritious alternatives to the junk food sold in neighborhood bodegas and convenience stores. Not only are job opportunities limited by poor schooling and job training, numerous studies have shown that many employers are unconsciously biased toward white job candidates over equally-qualified minority candidates. The economic inequality between white people and minorities can’t be denied. White people who don’t see or understand the mechanics of institutional racism are likely to lay blame for this disparity – consciously or unconsciously – on the victims of systemic racism.

People of color are disproportionately stopped, arrested, prosecuted, incarcerated, and killed in police custody, relative to white citizens. This is either because people distinguishable by their abundance of epidermal melanin are “racially” more prone to criminal behavior (as some people still believe), or because our criminal justice/law enforcement system is systemically racist, and in need of reform.

On race relations

As an Army officer, I was trained to be a race relations educator at the Defense Race Relations Institute (DRRI) in 1972, and spent a year in Germany leading race relations seminars. I’ve written in a previous post (“Who is a racist”) that it’s not simply a matter of whether one is or is not a racist. Personal racism isn’t a binary, either/or phenomenon. Racism exists along a continuum, between “hardly any racial bias” and “hates people because of their skin color or ethnicity.” Everybody has a place somewhere on this continuum, and where you place yourself may not be where others who know you would place you.

One thing I learned at the DRRI, and still believe, is that you can’t grow up in a racist society such as ours, unaffected by racism. None of us are completely color blind. I’ve known many people who would reflexively deny having any racist beliefs or tendencies whatsoever, because they don’t understand the insidious nature of racism. To admit that you’ve inherited residual racist beliefs or inclinations doesn’t mean that you’re a bad person or, if you’re white, that you should feel guilty for being white. Another thing I learned at the DRRI was that guilt is a lousy motivator for change. Despite my personal history of ongoing self-examination and of actively opposing racism since I was a young man, I still can’t claim to be completely free of racism’s taint, myself.

In my DRRI training, I learned about both personal racism and institutional racism. I think that there are still a lot of good, well-intentioned white people who are blind to the institutional racism that still exists in our society; but in this post, I’ll only be writing about personal racism – specifically implicit bias and confirmation bias.

Bias is universal; it’s part of being human. It can be racial, cultural, religious, or political. Implicit bias is often reflexive and unconscious, and it’s not necessarily a bad thing. I may have a bias for bland food or for spicy hot food, depending on the foods I grew up eating. This may mean that when I eat out, I’m not likely to try a new dish that the menu describes as spicy hot. It may mean that when I choose which movie I want to see at the cineplex, I’m more likely to choose a film whose protagonists resemble me, or who come from my culture. It’s easier to identify with people I see as being like me. It doesn’t mean that I’m racially prejudiced; it’s just my unconscious preference. Being a heterosexual, I may prefer a traditional romantic comedy over a gay-themed love story, even if I’m not homophobic. No matter your race or cultural identity or sexual orientation, you’re biased to choose one thing over another, based on your life experiences.

Confirmation bias is also universal, and usually unconscious. It means that if I’m given new information on a topic that I’ve already formed an opinion about, I’m more likely to believe and remember things that confirm what I already believe, and less likely to have my opinion changed by things that might challenge my belief.

Even if we bear no ill will to persons of a racial or ethnic group other than our own, our beliefs about them may be unconsciously influenced by common stereotypes attached to that group of people. When I lived in Germany, I observed that some of the same stereotypes that have been attributed to African Americans in our society were attached to Turkish “guestworkers” who lived in ethnic ghettos: they were lazy, stupid, untrustworthy, and all the men wanted to have sex with German women.

The biggest remaining fallacy that continues to fuel racial stereotyping is the idea that race is a biological phenomenon. The concept of race as we know it didn’t exist until the era of European colonialism. Race is a social construct designed to justify the exploitation, colonization and enslavement of that segment of the human race identifiable by the darkness of their skin. Part of the concept is hierarchical: some races are superior to others. In fact, all human beings belong to the same race. If you go back far enough, we’re all kin.

So now I question whether “race relations” is an outdated term, perpetuating the notion of different races. It seems to me that “intra-racial relating” might more accurately describe the sometimes troubled relations within the family of man.

Hip, cool, and woke

I got a BA in English before I got my MA in psychology, and I’ve always been fascinated by language. I learned that languages – other than dead languages like Latin – are living things that evolve over time, to capture meaning and convey information. So, my first point is that “hip,” “cool,” and “woke” are just words, with meanings that vary from person to person. What’s cool to you might not be cool to me. But in my experience, many people use hip and cool interchangeably, and apply the term hip to places like coffee shops and to things that can be bought, like clothes or haircuts. This is a significant departure from the original meaning, in which hip is a state of mind.

Cool is in the eye of the beholder, and it applies to people, places, things and actions: you can eat at a cool tavern, with cool friends, wearing cool clothes, and listening to cool jazz. Cool fads come and go, and the cachet of cool is used to move a lot of merchandise. Things can be made to seem cool by marketers and influencers. Certain things cannot be cool, like prisons.

Hip can’t be bought or rented or worn or inhabited, although marketers have used the word as an adjective, to be applied to this destination or that product. In its original meaning, hip can only be applied to persons. Dialectically speaking, you either are or are not hip – but it can also be seen as existing along a continuum. A synonym for hip is “aware,” as in “hip to what’s goin’ down.” “I’m hip” doesn’t mean the same thing as “I’m cool”; it means “I understand.” To be hip is to be in the know, to see what un-hip people don’t see. Hip originated in Black dialect, because people of color tend to be aware of things that the majority of white people are blind to – as I once was. If you were hip, you kept your eye on what The Man was up to.

I wasn’t truly hip to American racism until, as an Army lieutenant, I attended the Defense Race Relations Institute (DRRI), to be trained as a race relations education officer. Sure, I had been aware of some aspects of racism before then. I knew that people of color were frequently discriminated against. Although I’d had Black classmates and teachers at the international school I’d attended in Vienna, my Georgia high school didn’t integrate racially until my junior year. I hated racism and thought I was pretty well-informed about it.

But it wasn’t until my immersion in race relations education at the DRRI that I truly became “hip to what’s goin’ down” in America. I learned not only from classroom instruction, but from late-night discussions in the barracks with brown- and black-skinned classmates (as well as a few Asians and Native Americans), who talked frankly about their own life experiences. We were the pilot class at the DRRI, and we felt a sense of brotherhood and trust. I became hip to the reality that white people live in a different America than people of color. I began to see things that I had been blind to.

Just as religious people can be guilty of “holier-than-thou” attitudes, it’s possible to fall into “hipper-than-thou” judgments. Hipness is perhaps best viewed as existing along a continuum, and where you place yourself on the Hipness Scale may not be where other hip people would place you. But it’s not a contest.

The concept of hipness seeped into white consciousness via the so-called Beat Generation, especially through the writings of Jack Kerouac. (He wrote that his definition of hip was someone who could score drugs in a foreign country.) The Beat Generation had a great influence on the Baby Boomer generation, and hipness was so central to the youth rebellion of the sixties that the long-haired, tie-dyed cultural rebels became known in the media as hippies. Not all of them liked the term, but the so-called hippies prided themselves on “knowin’ what’s goin’ down” and “dropping out” of conventional society. They kept their eyes on what The Man was up to.

Playwright Lorraine Hansberry wrote, “There are no ‘squares’ . . . Everyone is his own hipster.” What she meant by hipster was something entirely different from the contemporary meaning of the word, as I understand it. These days hipster seems to describe a style or a lifestyle and, to me, more resembles “cool” than the original meaning of hip. Perhaps the word “woke” is the contemporary analog of “hip to what’s goin’ down,” with a side of political correctness.

Values clarification

In order to rationally address the subject of values, I need to first examine the notion of absolute values. When I was a boy, I believed in certain absolute values; but as a young man, I began to question the concept of moral absolutes. Raised a Christian, I’d been told that the truth is always simple and, early-on, I liked that idea. But moral absolutes reduce the range of human choices to black or white, eliminating any shades of gray. That’s not the world I live in. Moral choices are more complicated than some people would have you believe.

The Ten Commandments are a classic example of moral absolutes. “Thou shalt not kill” is a moral absolute, and yet many people who profess the Ten Commandments as the basis of their moral code think that killing by soldiers in wartime is acceptable. Some “pro-life” people who believe that abortion is murder believe in capital punishment.

As an idealistic teenager, I got involved in the Sing Out America/Up With People organization, organizing local Sing Out casts in Georgia and South Carolina. Sing Out America was a promotional effort for the Moral Re-Armament (MRA) movement. MRA claimed to have a Western ideology to counter Communism, and promoted the idea of “absolute” honesty, purity, unselfishness and love. While this appealed to me at the time, I gradually became disillusioned with the MRA philosophy and the whole concept of absolute values. I got comfortable with relativism and ambiguity in the determination of moral values.

I believe that values are bound to culture and circumstance. In primitive “subsistence economies,” where everyone has to carry their own weight in order for the tribe to survive, it’s understandable why an elderly or disabled person might be expected to leave the tribe and die of exposure in the wilderness. In an economy of wealth, where more is produced than is needed for tribal survival, this practice is unnecessary, and would understandably be seen as cruel or inhuman.

So, I’m a believer in moral relativism. I believe that circumstances often determine what is “right” and what is “wrong.” This moral philosophy has been called situation ethics – a concept attacked by religious zealots as a Satanic war on morality. The Republican Party has presented itself as the “party of values,” as if its values were absolute. In fact, everybody has values, from the Pope to gangsters like Tony Soprano. They just value different things.

Values clarification rises above the notion of absolute values and simplifies the moral equation with its specificity. Every moral stand involves a choice – valuing this over that. You either value your vow of fidelity to your spouse, or you value having sex with somebody else. You either treat people the way you want to be treated, or you sometimes steal from other people. You either value staying high on your favorite drug all the time, or you value a life of moderation and responsibility to the people who depend on you.

There are professed values and lived values, and we’ve all known hypocrites who don’t live by the rules they say they believe in. The Bible says pretty clearly that rich people don’t go to Heaven, and yet there are many rich Christian fundamentalists who apparently believe that a camel can go through the eye of a needle. Jesus didn’t say it would be easy to love your enemy, and your neighbor as yourself; but I’ve known a lot of Christians who don’t even try, although they give lip service to Jesus’ prescriptions. Organized religion is a breeding ground for hypocrisy, and I feel sure that there are plenty of Muslim, Jewish and Hindu (etc.) hypocrites.

Religious or not, many people lay claim to the “right” values, but only moral absolutists can coherently make that claim. Some of them just don’t think or care about the gap between their professed and lived values; others rationalize and equivocate, as with the Christian belief that we’re all Sinners, but that our belief in and love of God will save us from paying for our sins.

Socrates said that the unexamined life is not worth living. Voltaire said that doubt is a disagreeable state, but that certainty is a ridiculous one. Since I can’t make myself believe in the tenets of any particular religion – although most of my lived values are Judeo-Christian in origin – I remain a moral relativist. An existentialist at heart, I can live with ambiguity, uncertainty, shades of gray. Values clarification is a tool I can use to examine moral choices. My first marriage was polyamorous; but although I’ve been happily and monogamously married for thirty years to my wife Maria, I don’t necessarily view monogamy as morally superior to polyamory. The choice between the two is a matter of situational, lived values.

Psychotechnologies of Influence

I think that most people are unaware of the extent to which their beliefs and their daily choices are shaped by advertising and public relations, and the deceptive and manipulative psychotechnologies they frequently employ. We’re so inundated every day by symbols and messages crafted by professional persuaders that their influence is largely invisible to most people. We’re all targets of corporate social engineers, and there wouldn’t be so many advertisements if they weren’t effective.

The “Father of Public Relations,” Edward Bernays, was a government propagandist during World War I. After the war, realizing that propaganda had peacetime applications, he re-named it public relations, and wrote the rulebooks for a new profession: the public relations counsel (in the sense of “legal counsel”). Bernays was a nephew and confidant of Sigmund Freud, whose teachings about subconscious influence were combined with the techniques of propaganda in such books as Crystallizing Public Opinion (1923) and Propaganda (1928).

Bernays wrote about “the possibilities of regimenting the public mind” and “the conscious and intelligent manipulation of the organized habits and opinions of the masses.” The practitioners of this new science of influence and persuasion, he wrote, “constitute an invisible government which is the true ruling power of our country. We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we have never heard of.” Over the last century the propaganda industry (advertising, public relations and political consultancy) has become an indispensable part of both commerce and politics. You may never have heard of Edward Bernays, but he was one of LIFE magazine’s “100 Most Influential People of the Twentieth Century.”

Persuasive messages and campaigns that rely on logic and facts aren’t propaganda. Propaganda aims at the gut, not the brain, using deceptive and manipulative techniques to influence and persuade. The techniques of propaganda aren’t the only weapons in the arsenal of the propaganda industry. Rhetorical devices, symbol manipulation, heuristics, and psychological learning theory – specifically classical (Pavlovian) conditioning and operant conditioning – are among the psychotechnologies of influence and persuasion utilized by propagandists. I’ll write about some of these tricks of the trade in Part 2, but I’ll first name and describe the classic techniques of propaganda. Most of these techniques were identified by the Institute for Propaganda Analysis, a public interest group in the thirties whose stated goal was to “teach people how to think (independently), not what to think.”

Probably the most common propaganda technique is assertion: stating an opinion as if it were a fact. Assertions range from outright lies to cleverly-worded messages with no objective factual basis. If you qualify a stated belief with “I think,” “it seems to me,” or “in my opinion,” it’s not propaganda. President Reagan’s famous statement that “government is the problem” is a classic example of assertion. Another of the most frequently used propaganda techniques is ad nauseam – the endless repetition of assertions, slogans, or advertising jingles. A phrase attributed to Hitler’s Minister of Propaganda, Joseph Goebbels, is that “a lie repeated a thousand times becomes the truth.”

Transfer is a term for creating an association, positive or negative, between two unrelated things. (From a psychological point of view, transfer involves classical conditioning.) Using an American flag as a backdrop for a political message is an example of positive transfer. A background visual of burning stacks of money is an example of negative transfer. Bandwagon suggests that we should be on the winning side and avoid being left behind with the losers: “Everybody knows that’s the truth” or “for those who think young.”

Other propaganda deceptions include lies of omission and card-stacking, where facts are cherry-picked to promote the message and any contrary facts are omitted or misrepresented; distortion, an insidious mixture of facts and outright lies; and half-truths, where facts are blended with assertions. Glittering generalities like “national honor” and “best country in the world” are subjective and have no objective basis for definition. Name-calling attempts to reduce a person to a label. With ad hominem, the messenger is attacked to distract from the message, e.g. “You can’t trust anything he says.” Testimonial and appeal to authority attempt to link the message with an admired person or authority, whether Abraham Lincoln or “nine-out-of-ten dentists.” Celebrity endorsements also fall into this category.

Simplification and pinpointing the enemy offer simple explanations for complex issues and propose a culprit for an identified problem, as in Hitler’s scapegoating of the Jews. Appeal to fear and stereotyping also belong to this cluster of techniques – favored by demagogues and xenophobes – and are self-explanatory. The black & white fallacy is also related: if you’re not with us, you’re against us. There’s no middle ground.

The result of a successful propaganda campaign is ignorance or deception on a mass scale. If this post has stimulated your curiosity about psychotechnologies and corporate social engineering, I’ve written a book about it: Ad Nauseam: How Advertising and Public Relations Changed Everything – available in paperback online, or as an e-book.

The Great Secret

In a previous post I mentioned the Great Secret. My fictional protagonist – a man on a quest for Meaning – found a book with the title The Great Secret, only to discover that the pages were blank. I actually have a book titled The Great Secret, written in 1922. Its Flemish author (who wrote in French), Maurice Maeterlinck, won a Nobel Prize in Literature for his poems, plays and essays. Despite its having been written almost a century ago, it’s well-researched and still provides a valuable guide to the tradition of the Great Secret. What attracts me to the notion of the Great Secret is my sense that we all live at the heart of a Mystery: what is life? what is consciousness? what does it all mean?

Maeterlinck studied the Vedic (Hindu) tradition of India, Egyptian religion, Zoroastrianism, Greek mystery schools, Buddhism, Jewish Cabalists, Gnostics, Neoplatonists, and alchemists. I don’t think modern scholars can add much to what Maeterlinck learned about these ancient wisdom traditions. He was no starry-eyed True Believer, but a thorough and objective scholar. It’s been said that “those who know don’t tell; those who tell don’t know.” Maeterlinck doesn’t offer a definitive answer to the question, “what is the Great Secret?” but shares what he’d learned from years of study – food for thought. His succinct conclusion is mysterious, not definitive – as you will see at the end of this post.

There were many mystery religions and cults in the ancient world. They often had an outer circle of adherents who were given one set of teachings (exoteric knowledge), and an inner circle of initiates to whom the Great Secret (esoteric knowledge) had been revealed. This knowledge challenged the conventions of the outer circle, often in a shocking way. Imagine growing up believing that God is a male, only to be told by the high priest at your initiation that God is actually female. Maeterlinck suggests that what was whispered in the ear of Egyptian initiates was, “Osiris is a dark god.”

One of the core tenets of the Vedic tradition is that all things are one thing: Brahman. Maya – the veil of illusion – keeps us from knowing our identity with all things. (The greeting/blessing “namaste” is an acknowledgement of the divinity of the person being greeted.) This idea of the unity of all things can also be found in other ancient mystical sects, as well as in some modern mystical philosophies. A common thread in various mystical teachings/traditions is that if all things are one thing, then you don’t have to go outside of yourself to discover The Truth. The macrocosm is contained in the microcosm.

A saying often attributed to Saint Francis of Assisi holds, “What you are looking for is what’s looking.” Rudolf Steiner, founder of Anthroposophy, wrote, “It is in the soul that the meaning of the universe is revealed.” Maurice Maeterlinck put it this way: “It is in you yourself that (God) is hidden and it is in you yourself that you must find him.” This echoes the beliefs of such Christian mystics as Meister Eckhart, as well as mystical Jewish and Muslim sects.

Another common thread in mystical traditions is that the Great Secret is something to be experienced, not understood. Mystics do not seek contact with, or knowledge from, the Divine; they seek union. A common belief in mystical traditions is that “the vessel must be prepared” to hold the wine of revelation. Sometimes the preparation involves an ordeal of some kind. Other times it means practicing a discipline, such as meditation or asceticism. But at the very least it means emptying your cup of your old beliefs, so that new wine can replace the old. A Sufi saying has it that “when the student is ready, the teacher will appear.”

In his novel Zorba the Greek Nikos Kazantzakis wrote, “Everything has two meanings, one manifest, one hidden. The common people comprehend only what is manifest.” Maeterlinck wrote, “Humanity has need of the infinite.” His best summation of his thesis is, “The Great Secret, the only secret, is that all things are secret.” Go figure.

Plutophilia – a proposed diagnosis

Psycho-diagnostics is culture-bound. The “Bible” of psychodiagnosis in this country is the Diagnostic and Statistical Manual of the American Psychiatric Association (DSM), and from time to time a committee of psychiatrists updates it. The current edition is DSM-5. In DSM-II, homosexuality was classified as a mental disorder, but this error was corrected in a later printing. DSM-III also eliminated the “neurotic disorders” listed in the prior editions. What used to be called Multiple Personality Disorder is now called Dissociative Identity Disorder. Some diagnoses have a limited lifespan.

Each diagnosis establishes multiple criteria (e.g., descriptions of symptoms), a certain number of which have to be met in order to establish the diagnosis as accurate. Psycho-diagnostics isn’t rocket science. It’s often imprecise, and relies more on theories than on verifiable data. Unlike most physical disorders, there are no biological markers to distinguish (for instance) Schizophrenia from Schizoaffective Disorder or Bipolar Disorder, manic. Much psychodiagnosis is educated guessing. The criteria for what’s considered psychopathology are values- and culture-bound, and sometimes arbitrary.

Other cultures recognize mental illnesses that aren’t found in the DSM. Amok is a mental disorder that occurs in Malaysia, Indonesia, and Polynesia, where people (mostly men) go berserk and assault anyone in their path. Koro is a persistent anxiety state that manifests in some men in Southeast Asia, based on their belief that their penis is shrinking, or retracting into the body, and that this can lead to death. Susto is a belief in “soul-loss” in some Hispanic cultures, which is believed to cause vulnerability to a variety of illnesses. A lot of people around the world believe in illnesses caused by voodoo/obeah/root magic hexes or spells, or the “evil eye.”

Having stated that psychodiagnosis is somewhat arbitrary and culture-bound, I’ll try to make the case for a new diagnosis that is bound, not to an ethnic or national culture, but to the multinational corporate culture. Only the very rich can develop this pathology. I believe that there are cultural, economic, and political reasons why Plutophilia – excessive love of wealth – isn’t a recognized “paraphilia,” alongside necrophilia and pedophilia. (Plutophobia – fear of wealth or money – is believed by some clinicians to be a treatable psychopathology.) According to the Bible, it’s not money, but the love of money that’s the root of all evil.

Here are my suggested diagnostic criteria for a diagnosis of Plutophilia: (1) Obsession with the endless accumulation of wealth, far beyond what is needed or will be spent in a lifetime; and persistent or compulsive behaviors in the service of wealth accumulation. (2) Compulsive competition with other plutophiles in amassing the greater/greatest fortune. (3) Unconcern with the negative economic, social, and ecological consequences of their exploitation of workers and/or other resources, and of their obsessive profiteering. (4) Delusional belief in their (social Darwinistic) superiority as human beings, and in having “earned every dollar.” (5) Insatiability. No matter how much wealth is accumulated, it’s never enough. (6) The belief that their psychopathology is a virtue. I’d say that meeting five of these six criteria would suffice to establish the diagnosis.
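For readers who like the “five of six” rule spelled out: a DSM-style criteria count is just a tally against a threshold. Here’s a playful, minimal sketch in Python – the criterion labels and function names are my own illustrative shorthand for the six criteria above, not clinical terminology:

```python
# Toy sketch of a DSM-style criteria count (not a real clinical instrument).
# Labels are illustrative shorthand for the six proposed criteria.
PLUTOPHILIA_CRITERIA = [
    "obsessive_accumulation",   # (1) endless accumulation far beyond need
    "compulsive_competition",   # (2) competing to amass the greatest fortune
    "unconcern_for_harm",       # (3) indifference to social/ecological costs
    "delusion_of_superiority",  # (4) "earned every dollar" social Darwinism
    "insatiability",            # (5) no amount is ever enough
    "pathology_as_virtue",      # (6) sees the obsession as a virtue
]

THRESHOLD = 5  # five of the six criteria suffice to establish the diagnosis


def meets_diagnosis(observed, threshold=THRESHOLD):
    """Return True if at least `threshold` listed criteria are observed."""
    met = sum(1 for criterion in PLUTOPHILIA_CRITERIA if criterion in observed)
    return met >= threshold


# A case meeting five criteria qualifies; drop one and it no longer does.
case = {
    "obsessive_accumulation", "compulsive_competition", "unconcern_for_harm",
    "delusion_of_superiority", "insatiability",
}
print(meets_diagnosis(case))                      # True
print(meets_diagnosis(case - {"insatiability"}))  # False
```

The same count-against-threshold pattern underlies most polythetic DSM diagnoses, which is part of why two people with the same label can look quite different.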

Plutophilia is responsible for the vast gap between the wealthiest few and the masses that live in, or on the edge of, poverty. It harms society as surely as an unending drug abuse epidemic. However, having the disorder can’t be the grounds for involuntary commitment and/or court-ordered treatment. Sadly, there is no known treatment or cure.

Existentialism and psychotherapy

Although I studied a variety of therapies in my preparation for a career as a psychotherapist, I never identified exclusively with one approach – gestalt, client-centered, behavioral, psychodynamic – as a descriptor of my style of therapy. I was an eclectic practitioner, but have always considered my therapeutic orientation to be existential.

I respect that there are therapists whose work has a religious foundation, but mine was a secular practice. I validated faith in God and prayer as best I could, with clients who found meaning in their religious beliefs; but if clients asked me to pray with them, I declined. Although I was raised as a Christian, and most of my values are rooted in the Judeo-Christian ethic, I’m an agnostic of the kind that’s very comfortable with saying “I don’t know” when asked about specific religious beliefs. I think that it’s just as arrogant for an atheist to assert sure knowledge that there is no God as it is for a religious person to assert that I’m in error for not believing what they believe. Define God, then we can talk.

I don’t believe that I have the authority to definitively answer questions about religion and am tolerant of those who claim to “know” that their beliefs are true, as long as they do no harm as a result of religious beliefs. Of course, there’s considerable room for debate about what constitutes harm. (I personally consider any form of indoctrination to be harmful.) I consider myself an existentialist because existentialism directly addresses morality and personal responsibility, without the excess baggage of sin and redemption and pleasing God. I’ll briefly summarize some of the basic principles of existentialism, as I understand them.

First, existentialism asserts that there’s no universal Meaning “out there” that all right-thinking people can apprehend – as opposed to religions, which assert that there is, i.e. “God’s plan.” To existentialists, concepts like Sin and Redemption and Divine Intercession are constructs based on religious doctrine. They don’t exist in any objective sense. Meaning only exists in the eye of the beholder. Life is absurd, as illustrated by Albert Camus in “The Myth of Sisyphus.” Sisyphus continues to push the boulder up the hill, despite knowing that it will just roll back down. He persists, despite the absurdity of his efforts, because the act has meaning for him.

Because there are no absolute rules, or Divine rewards or punishments in an afterlife, we are each free to do whatever we want. But the other side of the coin of freedom is responsibility. We’re absolutely responsible for whatever we choose to do, and can choose to behave morally even if we don’t believe in Heaven and Hell. We can choose to live in good faith with others, because of our moral responsibility for all of our actions. Although we can find joy and meaning in authentic relationships, we’re all essentially alone in our lives. (A song sung by country singer Bill Monroe expresses this as well as anything I’ve read on the subject: “You’ve got to walk that lonesome valley,/ You’ve got to walk it by yourself,/ ‘Cause nobody else can walk it for you./ You’ve got to walk it by yourself.”) We each have to deal with the Angst (anxiety) and dread that come from the knowledge that we will someday cease to exist. Existentialists don’t rely on the comfort of religious promises of eternal life for the faithful to come to terms with our mortality.

To say that there’s no objective Meaning to existence “out there” isn’t to say that meaning is unimportant. As an existentialist I’m free (like Sisyphus) to find, or create, my own meaning. One of the best-known existential therapists, Viktor Frankl, named his school of psychotherapy logotherapy – from the Greek “logos”: meaning, or reason. (I’ve written about Frankl in previous posts. I’ve recommended his book, Man’s Search for Meaning, to more clients over the years than any other book.) Although I didn’t practice logotherapy, per se, I’ve worked with many therapy clients to help them find or create meaning in their lives. It can be a life-or-death matter for people who are suicidal.

I initially saw existentialism as grim and forbidding: if there’s no extrinsic Meaning to existence, then all we can do is sweat along with Sisyphus, acting as if there were meaning to our lives. But now I see the richness of choice, where I once saw austerity. Existentialism gave me a philosophical context for the I-Thou encounters of psychotherapy. We all have a need for our lives to mean something; but we needn’t rely on “God’s plan,” as taught by this or that religion, or on promises of eternal life, to find meaning in our lives.

If you want to learn more about existentialism and the colorful characters (Jean-Paul Sartre, Simone de Beauvoir, as well as Camus, Heidegger and Merleau-Ponty) who formulated its principles, I recommend Sarah Bakewell’s highly-readable At the Existentialist Café: Freedom, Being and Apricot Cocktails. I’d never have guessed that phenomenologist Maurice Merleau-Ponty was good at dancing the Jitterbug.

My love story

I fell in love with my first wife, Doris, as an Army lieutenant serving in Germany. She’s a German who spoke fluent English when I met her. I still spoke some German, having lived in Vienna for four years when I was in middle school. Doris was my first lover. She was still living with her parents, but we soon found an apartment and moved in together. Believing that we were destined to grow old together, we got married – much to the relief of her parents. I had a Top Secret security clearance and had to get permission from the Army to marry a foreign national.

We had an unconventional marriage. Although we were in love, we discovered that neither of us was reflexively jealous, or a conventional thinker. We read the popular book Open Marriage, by Nena and George O’Neill, to one another, and discussed the option of having other lovers. Some open marriages are monogamous: only one chapter of the book is about having other lovers, and the O’Neills expressed their dismay at the widespread perception that open marriage automatically meant polyamory. Open marriage is a much larger concept – radical for its day – of the marriage bond as an ongoing choice to be together in a loving, committed relationship, as equals. We wrote our own wedding vows, pledging to be together “as long as we both shall love.”

During the six years of our marriage, Doris and I each had other lovers, and jealousy wasn’t a problem for either of us. We usually got to know and like one another’s lovers, and there was never any secrecy or deception. We never considered ourselves “swingers,” and we didn’t seek out strangers to have sex with. It’s just that we had the option to become lovers with some of our opposite-sex friends. When we divorced, amicably, it wasn’t due to our polyamory. We remained friends and, for several years, lovers. Even after the divorce, my parents still treated Doris as a family member. She lives in Germany now, and we remain close friends to this day. My wife Maria and I visited her two years ago.

Doris came to visit me in Beaufort, SC after our divorce and, to my delight, she and Maria hit it off right away. Maria and I were lovers, but at that point we were seeing other people, too. Years later, Doris told me that she’d regretted our divorce, and had hoped to “win me back.” But when she saw how Maria and I were together, she knew that we were meant for each other. Despite this discovery, she was never jealous of Maria, and over the years they’ve come to be like sisters. They have three things in common: each is a lovely woman, they are among the most honest people I’ve ever known, and neither of them has a mean bone in her body.

I knew from early on in our relationship that I wanted to marry Maria. She had just divorced her husband of seventeen years, and I knew that she needed to “play the field” before she might decide that she wanted to spend her life with me. (I feared that I might turn out to be her “transitional man.”) She knew that Doris and I had had a polyamorous open marriage, but once she decided to marry me, she wanted our marriage to be monogamous. We’ve both been faithful to one another since the day we agreed to marry, and I’ve never regretted giving up a polyamorous lifestyle. I consider our marriage to be an open marriage because we’re autonomous equals, neither of us tries to control or dominate the other, and we’ve been together in a committed relationship for thirty years out of loving choice, not momentum or a sense of obligation.

For a polyamorous open marriage to work, both partners have to want it, to trust one another and be trustworthy, and to always remember that theirs is the primary relationship in both lives. I have friends who have had a polyamorous open marriage for over forty years. It’s my belief that polyamory can be a valid choice in a loving marriage. While most marriages probably need to be grounded in a pledge of marital fidelity, I don’t consider monogamy to be morally superior to polyamory. It’s just a choice that some loving couples can make.

I don’t miss having other lovers because I’m happily married to the love of my life, who isn’t polyamorously inclined. But I can still tell Doris that I love her, over the phone or in person, and Maria isn’t jealous. Doris and I get along better as brother and sister than we did as a married couple. I have the good fortune of still being a close friend to my first lover, and one of the great joys of my life is to see and hear Doris and Maria talking and laughing together. Even if they’re laughing about me.

Suicide prevention

While the act of suicide is sometimes a long-considered, planned option which nobody can prevent, most suicide attempts are impulsive. According to one study, approximately one quarter of the people who try to kill themselves do so within five minutes of their decision to attempt suicide. Only a small fraction of people who survive a suicide attempt go on to die by their own hand. Throughout my career as a psychologist, I assessed many people shortly after a suicide attempt. A question I always asked them was, “Are you glad that you’re still alive?” Almost all of them were glad that their suicide attempts had failed. I concluded that most suicide attempts are mood-specific behaviors, often involving intoxication on alcohol or other drugs. Once their mood changes, or they sober up, they no longer want to end their lives.

While in grad school, I volunteered as a telephone crisis hotline worker. I was trained to talk to people who were in crisis, to keep them from engaging in attempts to harm themselves or others. From early in my clinical practice I was called on to evaluate the suicide potential of clients. I learned that many people who attempt suicide are ambivalent about living. “To be, or not to be; that is the question.” At the core of this ambivalence is the issue of existential meaning.

One of the major existential therapists of the twentieth century was Viktor Frankl, an Austrian psychiatrist whom I’ve written about in previous posts. His book Man’s Search for Meaning was based on his experiences as a survivor of a Nazi death camp. He observed that in such a hellish environment, those who fought to live were people who had a sense of meaning in their lives. He called his method of psychotherapy logotherapy (logos means “meaning” or “reason” in Greek), and his therapeutic approach was to help patients find, or create, meaning in their lives.

Lives bereft of meaning are empty lives, but sometimes the vacuum can be filled. Although I was able to help some suicidal clients to find something to live for, one of my severely depressed therapy clients died by his own hand. It was the worst thing that happened in my career. I really liked “Allen,” saw strengths and personal qualities that he couldn’t see, and worked in therapy to help him find reasons to go on living. I saw him on Wednesday afternoons, and he always kept his appointments. When he didn’t come in one Wednesday, I immediately called his apartment. When he didn’t answer after several tries, I looked up his address and drove to his apartment. When he didn’t come to the door when I knocked and rang the bell, I intuited that he was inside, dead. Sadly, this proved to be the case. It turned out that he’d bought a gun that morning, gone home, and used it. On a Wednesday, instead of keeping his therapy appointment.

I went through predictable self-recriminations and judgments. Could I have done anything differently that would have prevented his suicide? But I recognized this as a question that could never be answered. My colleagues knew that I was grieving as if I’d lost a family member, and supported me in my grief process. A peer review of my clinical records found that I’d done and documented everything properly, in terms of recognizing and dealing with Allen’s suicide risk.

A few years ago a close friend committed suicide. She suffered from bipolar disorder, and had confided in Maria and me that she would take a drug overdose in certain future hypothetical situations. She said it matter-of-factly, and wasn’t depressed when she said it. We knew that there was nothing we could say that would change her mind. We hoped that she’d never find herself in one of those imagined situations.

Philosophically, I’m torn on the issue of the “right to die.” If assisted suicide were legalized, it’s inevitable that some depressed people would convince themselves – or be convinced by others – that it was their duty to die: perhaps because they felt useless, or because they wanted to leave an inheritance rather than spend their money on their own medical care in old age. I’m no longer a therapist, but if I knew that someone was acutely suicidal, I’d do whatever I could to try to prevent an impulsive suicide attempt. (Many times, as a Designated Examiner in the Probate Court, I recommended involuntary hospitalization for suicidal people.) But once a person has suicided, I don’t make judgments about their decision to end their life. I don’t have the authority to judge.

Most people who end their own lives do it to escape intolerable pain – whether physical or emotional. Allen killed himself because he could no longer endure living with severe depression. His life had no meaning worth living for. I tried unsuccessfully to help him find reasons to live. Albert Camus considered suicide to be “the fundamental question of philosophy.” He wrote, “I see many people die because they judge that life is not worth living. . . . I therefore consider that the meaning of life is the most urgent of questions.”

Which takes us back to Viktor Frankl, who found meaning in the Hell of a Nazi death camp, survived, and went on to be a founder of the humanistic psychology movement.