What You Should Think About Organized Religion

December 25, 2007

“Let he who is without sin cast the first stone.”

–Jesus Christ

“It is impossible to prove a negative” is a statement far from insightful. For example, “this essay is not written in the Klingon language” is a negative statement that can be verified as well as any practical standard of proof would require. Proof of negative assertions becomes more problematic as discussions move from the specific to the general. “There are no polka-dotted swans” is an eminently likely proposition. However, it is possible to remain reasonable while taking the position that the best available proof merely establishes that a polka-dotted swan is an extremely improbable phenomenon.

When it comes to belief in an omnipotent being, a negative position is even more difficult to prove. Not only is it plausible to argue that such a being could defy any efforts at detection, but there is even a case to be made that an omnipotent being would not be constrained by logic. For absolute atheists, these conditions are problematic. Of course, monotheists are challenged just as strongly by the inability to prove that there are not multiple omnipotent beings.  Then consider the challenges of proving that their specific concept of a supreme being is a generally accurate reflection of reality.

Some people reach their own conclusions about matters of the divine. Yet many more allow their beliefs to be shaped by cultural traditions or even the dogma of religious institutions. This can be extremely problematic. Among other things, the embrace of organized religion tends to promote an unhealthy sort of inflexibility. This often stems from the perception that beliefs promoted as ancient wisdom are largely consistent with actual ancient beliefs. Yet is that perception justified?

Never mind variations in the content of sacred literature from one era or even one century to the next. Applications of religious thought consistently change to remain compatible with underlying social conditions. Excessive delay in this process simply results in a popular movement away from old faiths in order to embrace younger traditions. An honest study of religious history turns up all manner of examples where a faith that failed to speak to the great questions of the day yielded popular support to new spiritual movements eager to address those questions.

Even within a particular faith, there may be tremendous change over time. In the Middle Ages, Christian organizations actually ran brothels, and for centuries many priests openly married. Although clerical celibacy had been formally required since the twelfth century, it was only after being challenged on the practice of selling indulgences, an issue that helped bring about the Protestant Reformation, that the Holy See moved to rigorously demand chastity among all orders of clergy. Up to and during the American Civil War, some Protestant churches taught that God had ordained white hegemony and black slavery. Today some of those same pulpits are used to advance the argument that God demands equal treatment for all races.

Secular thinkers sometimes unfairly criticize religion for being unable to change with the times. Science may produce flawed understandings of reality, but it does so in a context of focusing on empirical evidence. Setting aside pseudoscience like global warming denial or “creation science,” real science is driven to change not by passions or politics, but by data that satisfies reasonable standards of proof. Even wild new ideas can be quickly adopted by science if they can be supported by hard evidence.

By contrast, social change and personal whims are the driving forces behind change in religious thought. The popularity of a belief about the natural world is not a factor in how readily it is accepted by scientists. In recent history, attitudes about race, gender, and sexual preference have brought about, and continue to bring about, change in religious practices and teachings. Looking back further, changing attitudes about government, sexuality, violence, and a host of other issues have left their mark on the ways of modern faiths.

Nearly all adherents to the teachings of an organized faith arrived at those beliefs by traveling one of two paths. The most common is inheritance. Early in life, perhaps even from infancy, a person may become immersed in rituals and indoctrinated in religious teachings. Rather than being allowed to form the capacity for sound judgement and then pursue answers to questions of theology and morality, pliable young minds are firmly imprinted with a personal attachment to a particular set of answers.

In other instances, faith is the product of experiences that coincide with an intense episode of personal distress. As emotions impair rational judgement, the wholehearted embrace of a new worldview (not to mention entry into a new social circle) can provide relief and support in a time of crisis. Sometimes the mechanism resembles a one-two punch, as childhood immersion in a specific organized faith produces a sense of comfort in religious association that is reinforced by the subsequent refuge provided by a religious rebirth.

Religious belief is not a uniformly pernicious influence. It provides real comfort to real people facing real problems. It can provide a sense of togetherness in times of increasing individuality and social isolation. It may even increase the intensity of the good feelings associated with personal triumphs or significant milestones in life. Perhaps other institutions and practices could serve these same needs. Yet it is hard to argue that, if all religious practice suddenly ceased, nothing worthwhile would be lost to humanity.

Of course, religious belief is not a uniformly positive influence either. Different faiths offer different teachings. Many of these faiths teach that others are false. In some instances, religious leaders actively promote hatred of human beings associated with different faiths. In fact, the condemnation of difference may even involve extremely violent struggles over relatively subtle theological distinctions. When a difference of opinion emerges among scientific thinkers, observation and analysis are decisive. When a difference of opinion emerges among religious thinkers, sheer force of advocacy is the decisive factor, as empirical evidence is rarely available (and often marginalized when it is available).

A measure of faith can be useful as an alternative to being consumed by the complexities of resolving all moral issues or surrendering to nihilism. Yet faith is counterproductive to the degree that it straitjackets ethical thought in hallowed, yet ultimately arbitrary, human doctrines. Perhaps no capacity for belief is more important than the capacity to believe that one’s own convictions may rest on erroneous conclusions. Whether the context is secular or religious, that capacity is essential to remaining in touch with reality and adapting to new information as personal growth, new experiences, and fresh discoveries provide access to increased knowledge.

In theory, participation in an organized religion may be just as harmless as participation in a social club. In practice, participation in an organized religion may be just as harmful as involvement with the most destructive political movements. If you are involved in such a faith, and you manage to take away from it only messages of love, peace, goodwill, tolerance, and humility, then you may benefit from that involvement. Yet if such involvement also generates ill will toward your fellow human beings, compelling reason exists to recognize the flaws of any teachings or practices that add fuel to the fires of hatred.

What You Should Think About Existentialism

November 30, 2007

“In the struggle between yourself and the world, side with the world.”

–Franz Kafka

In some circles, existential philosophy has the reputation of an angry teenager. Yet that reputation is different in crucial ways from the “nobody understands me” cliché of adolescence. The trials and tribulations of a typical experience with puberty are made to seem more intense by a host of physical and chemical changes. Little by little, young people must cope with a range of adult issues for the very first time. Like teenage acne, teenage angst is unsightly yet also perfectly natural and well understood by adult outsiders, most having endured it themselves.

By contrast, many critiques of existentialism do not stem from any sort of genuine understanding. It is one thing to have passing encounters with notions like individualism and uncertainty. It is a much different thing to delve into the profundities of the human condition without any ideological security blanket. Many are the clumsy critics, mauling great works of existentialist thought with interpretations bereft of nuance. Rather than embark on a lifelong journey of learning and personal growth, they wallpaper over great mysteries with conformity and faith.

Understanding existentialism begins by understanding the futility of asserting useful absolute knowledge. We can only be ourselves. Even much of what we know of ourselves comes through flawed perceptions and imperfect communications. All the knowledge we possess of entities beyond ourselves is also a product of those perceptions and communications. Then there is the ever-present prospect of faulty inference.

To uphold any teaching as beyond dispute is to assert that inhuman perfection exists within human belief. Yet this process does not end where it begins. Accepting the general limits of human understanding is a major step toward transcending the limits of any specific tradition or doctrine. Insofar as existentialists have any particular aim, it is to liberate the human mind from the circumstance of life as a moral marionette. However uncomfortable a question with no answer may be, it has clear advantages over dedicated entanglement in the threads of popular false narratives.

When existentialist ideas were emerging in the 19th century, even ivory towers were populated predominantly by people convinced that questions of morality yielded to certain answers rooted in traditional beliefs. To people firmly anchored in a particular religious or cultural worldview, it is unpleasant to confront the suggestion that life is packed with unknowns and unknowables. From Apollo’s chariot to literal interpretations of Genesis, it seems human nature to favor even outright implausible narratives over comfortable coexistence with the unknown.

Much of existentialist thought is concerned with philosophical deconstruction. This is no haphazard obliteration of all that has come before. Martin Heidegger, among others, favored the term “Abbau.” Perhaps the best metaphor for this process involves the architecture of a growing city. To deliberately level the entire place would be enormously harmful. Yet selective demolition of edifices that are not useful in the present is an essential activity that clears space for new projects that serve new needs.

Abbau offers us a minor paradox in that it is at once destructive and constructive. Just as decrepit brick buildings are best dismantled to make way for towers of glass and steel, invalid or obsolete ways of thinking are best abandoned so as to make way for more realistic and useful beliefs. It is a creative form of destruction, as the absence of dogma and falsehoods is itself a phenomenon worthy of creation. It also facilitates further constructiveness to the degree that accepting uncertainty establishes a foundation for later acceptance of novel information.

Existentialists are often accused of discarding all of tradition in order to embrace amoralism or nihilism. Yet this accusation can only be born from some simple-minded interpretation of philosophy. If anything, existentialists encourage the pursuit of knowledge about other moral and philosophical beliefs. After all, it is dogmatic thinking that causes the vast majority of human thought to be discarded as heterodox. It becomes much less difficult to assimilate the vast diversity of worthwhile human wisdom after recognizing the profound limitations of all human wisdom, including those beliefs one holds most dear.

Centuries earlier, the dawn of modern astronomy prompted ecclesiastical authorities to persecute, even kill, people guilty of no greater heresy than challenging official church doctrine on the nature of heavenly objects. Thus it should come as no surprise that existentialist writings condemning absolute faith in religious morality provoked, and in some circles continue to elicit, incendiary hostility from devout worshipers. The rise of secular governance, especially Western civilization’s embrace of free speech as a human right, protected men like Friedrich Nietzsche and Søren Kierkegaard from state-sanctioned reprisals for controversial publications.

Those two individuals have a peculiar part to play in the story of existentialism’s rise. Both struggled with inner demons even as they displayed outright genius in the analysis of human morality. If there is any real link between nihilistic brooding and existentialist philosophy, it is not in the actual message of existentialist philosophers but rather in the darkest moments of human drama endured by its pioneers.

Neither of them actually espoused nihilism. Kierkegaard was a devout Christian who observed (as is also apparent today) that a vast gulf divides the teachings of Jesus from the deeds of those most vocal about acting in his name. Yet Kierkegaard was also an existentialist. He granted that his faith was a personal choice rather than a logical conclusion, and he never lost touch with an inner struggle between faith and doubt.

By contrast, Nietzsche leveled many powerful broadsides at the core of religion. His command of religious history conspired with a rapier wit to make his works especially provocative. Even as he wrote about the folly of being certain in beliefs, his literary voice conveyed a merry prankster’s boldness. Traditional thinkers were insulted enough to see sacred teachings linked to the ancient myths from which they were so clearly derived. Adding ridicule to the mix helped to shake some readers out of mental malaise even as it afflicted some critics with obsessive hostility.

To some degree there is a link between Buddhism and existentialism. Some Buddhist teachings promote stark honesty regarding the human condition. Others emphasize the importance of arriving at beliefs as a continuous process of searching for personal enlightenment transcendent of any established doctrine.

Yet existentialism is no religion. In fact, it actively discourages the kind of orthodoxy that comes with most organized religious activity. The central lesson existentialism teaches regarding religion is that whatever wisdom priests and scripture may contain should be given due consideration right alongside wisdom that contradicts the assertions of clerics and holy texts. The search for insight is also a search for the will to let go of the false security provided by attachments to tradition, faith, conformity, nationality, etc.

Existentialism does not offer a path to the easy satisfaction of transcending doubts. This is good, because that easy satisfaction is the progenitor of dangerous zeal. By acknowledging that the human condition simply does not permit an absolute escape from the unknown, existentialism offers a means to become comfortable with abundant mystery. It shines light on the illusory nature of the comforts of dogmatic belief. By acknowledging the real limits of human knowledge, the stage is set for a rebellion against tradition.

Through this process of rebellion, guided by awareness of human limitations, it becomes possible to constantly refine one’s own beliefs, moral and otherwise. Few people find it controversial to assert that lifelong learning is better than settling for an outlook firmly fixed long before life’s end. Yet few also understand just why and how an adaptive personal approach to morality has more to offer than an inflexible doctrinal approach.

Existentialist philosophy offers a long, and occasionally absurd, journey to the frontiers of human understanding. Still, it seems unsound to me to avoid this journey. Attributing infallibility to any particular tradition or teaching can only retard personal moral growth. If there actually was a creative thought process driving the birth of the universe or the development of its inhabitants, it seems clear that this process left human beings with the capacity to think for ourselves. With or without a God watching over us, it seems better to exercise that capacity for moral reasoning than to settle for uncritical adherence to beliefs promoted by others.

What You Should Think About Skepticism

November 24, 2007

“Just think of the tragedy of teaching children not to doubt.”

–Clarence Darrow

By the time the ancient Greeks took to formalizing thoughts on belief, they also managed to formalize thinking on doubt. An influential thinker from Elis named Pyrrho witnessed firsthand many of the conquests of Alexander the Great. Some might argue that this association caused later scholars to place undue emphasis on Pyrrho’s legacy. Yet that legacy was at least worthy of some note.

The man wrote no great philosophical work, but as with Socrates, his students would boast of their association and labor to recall Pyrrho in his own words. This leads to historical accounts that blend earnest recollection with distortions meant to serve the agendas of philosophers promoting their own ideas. Still, it seems clear that the heart of Pyrrho’s teachings was that uncertainty is sound and right in ways that certainty cannot be.

His aim has been characterized as “emotional tranquility,” and he advocated the suspension of judgment. To him an ideal state of mind, given the term ataraxia, involved having no beliefs. Of course there is an amusing contradiction here. How does one pursue this ideal of having no beliefs without holding the belief that it is an ideal state of mind?

One account mocks Pyrrho as requiring the constant attention of handlers to prevent him from walking off cliffs or stepping in front of horse carts due to an inability to place stock in his own perceptions. In reality the man was almost certainly more reasonable. Even if his teachings were not as severe as the most extreme accounts suggest, they may still be thought of as a reasonable response to the problems of argumentative acrimony and divisive conflict.

In a world where life could be taken at the whim of a leader, flexibility in belief offers some survival value. In a world where political disagreements may tear a society apart, flexibility in belief may insulate an individual from the anxiety and pain of being an active partisan. Yet there seems to have been more to Pyrrho’s teachings than this. He laid the foundation for recognizing just how loosely beliefs may be anchored in reality.

From the ancient past to the modern world, this complex relationship has been the subject of much discourse. Robert Nozick was never shy about considering “brain in a vat” scenarios. Since all we know of the universe comes to us through our perceptions, there is no way to establish with absolute metaphysical certainty that our experiences are not part of some simulation that eclipses an underlying reality in which the individual is actually a brain in some alien stimulation and life support system.

It is fair to argue that what we know, and even beliefs about what we are, follow from the imperfect results of perception and analysis. Our senses and our minds can “play tricks on us,” leading to beliefs that depart considerably from what is real. Yet this sort of thinking can be taken too far. I favor use of the phrase “skeptics’ infinite regress” to describe this phenomenon in which a retreat from the very prospect of useful knowledge is fueled by contemplation of possibilities that are supported chiefly by appeals to the limitations of evidence . . . as opposed to something like evidence itself.

Walking off cliffs or into traffic because we doubt the reality of those phenomena seems as sure to be foolish as the term “foolish” is sure to be meaningful. I believe the greatest extremes of skepticism can rightly be pigeonholed as philosophical novelties rather than essential insights. Yet the broader phenomenon is clearly not useless. Just as an inability to believe would be crippling, so too would be an inability to doubt.

Belief and doubt are both fundamental phenomena that shape the way thinking beings relate to the world around them. The ultimate value of this thinking is heavily influenced by the degree to which belief and doubt are used to seek truth (or the degree to which they are used to avoid it). For example, someone with a deep emotional connection to a particular political perspective on a scientific question may level doubt at even the most rational analysis while eagerly offering up belief whenever it provides an opportunity to confirm a predisposition.

To use the term “skeptic” while engaged in irrational defense of a long-held viewpoint is somewhat misleading. This behavior is less an exercise of true skepticism than an exhibition of passionate belief. Mainstream perspectives on the 9/11 attacks, global warming, evolutionary biology, etc. are met with a great deal of “skepticism” but very little of the judicious manifestation of doubt that comes with real skeptical thought. Zealous adherence to contradictory beliefs masquerades as much more reasonable than it actually is.

The best application of skepticism is not in challenging beliefs one opposes, but instead in challenging beliefs one holds dear. A modern day skeptic does not cower behind a wall of media carefully selected to reinforce a single ideological perspective. Rather, the exercise of skepticism in our times involves reaching out to a diverse assortment of sources in the careful search for the genuine insights people with different opinions might possess. Faith in falsehoods follows from fixation on the findings of one faction.

You do not need to feel every individual raindrop to know that a storm is wet. Yet you also should not believe it is raining just because someone pissing on your leg tells you that it is so. To my personal chagrin, contemporary philosophical literature seems to trend toward novelties like the “brain in a vat” scenario more than it delivers practical wisdom like the appropriate uses of, and limits on, skepticism as practiced by a functional human being.

It can be argued that disengaging from reality has its uses. While subjected to torture, clinging to fantasy may be useful as a survival mechanism. Embracing it as a form of entertainment may also be satisfying. In times of great emotional distress, belief that could not withstand skepticism may be a more desirable alternative to accepting great loss or crumbling in the face of great peril. Yet in any situation where measured philosophical discourse is appropriate, it seems clear-cut that nothing offers better outcomes than making the best possible effort to engage with reality.

The limits of perception and cogitation provide us all with good reason to question our own beliefs. That so many of our beliefs are filtered through many others’ perceptions and cogitations makes this sort of questioning all the more worthwhile. It is the true skeptic who seeks out the best challenges while constantly learning from encounters with unexpected information. There are also terms for people who wallow in a single group’s orthodoxy and self-congratulation, but to me it seems misleading when they call themselves “skeptics.”

What You Should Think About Religion in Politics

October 10, 2007

“. . . but it is very important for people not to be haughty in their religion, and there’s all kinds of admonitions in the Bible — haughtiness, rightfulness is a sin in itself.”

–George W. Bush

One of the most disturbing trends in modern American politics has been the legislation of morality. Governor Rick Perry (R-Texas) displayed a profound lack of good sense when, faced with a concerned citizen’s question, he expressed a belief that political leaders have a duty to “legislate morality.” I believe this approach slightly misses the point of legislation and completely misses the point of the American Revolutionary War, not to mention other great American civic achievements like the Bill of Rights.

It is understandable why some people might have difficulty seeing this area as problematic. For many, a morality derived from religion is the wellspring of all that is thought to be good. For far too many of those, this includes the capacity to judge others as evil. It is unreasonable to expect all Americans, or even that minority motivated enough to participate in elections, to have a coherent philosophy informed by post-conventional moral thinking. On the other hand, for a leader of millions to lack such a useful faculty of judgment . . . is our political process really that bereft of selectivity?

Of course it is, but that is beside the point. My concern is that this nation, with its cultural foundation established by colonists intent on practicing religious beliefs at odds with life in an increasingly urbane England, should never go down the path of inflicting punishments on citizens unwilling to abide by religious strictures. Without really thinking about it, one might well believe this is an argument for legalizing murder and rape.

Of course, it is not, and that is very much the point. Legislation to ban murder and rape can be justified without any appeal to religious thought. Society as a whole is safer and more prosperous to the degree that innocent people can be protected from physical assault. Independent of any appeals to tradition or scripture or theology, there are enough strong arguments to constitute a compelling case for the criminalization of violent attacks.

When it comes to American public policy, only universal good makes sense as a touchstone for validating new laws. Of course this good need not extend across the literal universe, but it must apply to people of any faith. This includes people with no faith whatsoever. Respect for the Constitutional assurance of free religious practice provides a technical basis for upholding this standard. Respect for those victimized by predictable outcomes of legislative morality rooted in any specific faith or religious doctrine provides strong rational basis for upholding this standard.

It is not unreasonable to characterize the United States as “Christian” on a cultural level. Most of our institutions respect Christian holidays, and in most communities talk of religion implies that the subject is Christianity. However, it is both unreasonable and untrue to characterize the United States as a Christian nation in any legal sense. With painstaking care, the founders of this nation set out to establish a secular government in which a plurality of religions, in spite of disagreements in areas like virtue or sin, could co-exist in peace and harmony.

Every time a public figure drags religious convictions into a political discussion, it is (at least) a very small betrayal of our domestic tranquility. In those instances when it is not insincere pandering, it also manages to be a betrayal of reasonable civic discourse. One of our most popular Presidents, John F. Kennedy, went to great lengths during his national campaign to establish that, while he was earnest in his faith, he would never allow a religious concern to drive him to act against the best interests of this nation or its people.

Today, particularly with one of the two entrenched parties, it seems as if candidates are tripping over each other to demonstrate how quickly they would let their faith take precedence over their commitment to secular governance. While a sizable chunk of our own nation applauds some public figures’ refusal to accept the role of biological evolution in shaping the human form, the rest of the civilized world (perhaps along with the rest of our own population) looks on in stark dismay. Even if there were to be a President Huckabee, I doubt grant money would dry up for continued studies into archeology and natural history on subjects more than 10,000 years old. However, I wouldn’t expect policies from that White House to promote great strides in American biotechnology or science education either.

At this point perhaps some readers are thinking, “well, as a good Christian, I have nothing to worry about if public figures indulge in legislating their personal morality.” However, this is a much trickier matter than it seems at first glance. Just what is a Christian? By this I do not mean to indulge the sectarian invective Mitt Romney has been receiving lately. Rather I want to call attention to the wandering standards of America’s most vocal Christians.

A religious movement that could be thought of as an ancestor to modern evangelical churches was at the heart of alcohol Prohibition. Though the faithful read scripture holding that the first miracle of Jesus involved transforming water to wine, the misery associated with alcohol abuse left many Americans convinced that the stuff should be banned. As enormous congregations formed around charismatic leaders, all manner of potential social movements could have emerged. What did emerge was a monstrous beast of political activism that led directly to one of the biggest and most painful failures in the history of American domestic policy.

The funny thing is, I believe most evangelical Christians no longer feel that there is anything wrong with alcohol commerce, or even with taking a glass of wine at dinner. The Bible didn’t change. In fact, personalities involved in megachurch leadership didn’t even change much. What really changed was that firebrand preachers no longer could maintain credibility while calling for tougher alcohol laws. Hindsight ended a movement that would never have picked up steam if informed by foresight.

Yet to possess that foresight, one must recognize that religious morality, even that held by a majority, is still a personal thing. If you believe God doesn’t want you to eat citrus on Tuesday, then by all means don’t eat citrus on Tuesday. At the same time, consider the consequences of a nationwide Tuesday orange and grapefruit ban. Does it do any good to people of faith who would voluntarily abide by the restriction anyway? Does it do real harm to people who believe differently and might enjoy a juicy vitamin-rich snack between meals?

In short, if any sort of taboo has a place in the lawbooks, it is because the allegedly sinful act is also a genuinely hurtful act. Organized religions tend to be pretty consistent about promoting humility. It requires a great failure of humility to believe that badges and guns have any place in compelling strangers to abide by your own church’s notions of right and wrong. The essence of maintaining order in a society of many faiths involves drawing this distinction and ensuring that the personal nature of faith does not bleed into public policy.

To many, speaking of religion on the campaign trail may seem like a sign of personal virtue. In the context of a tolerant society blessed by cultural pluralism, it is quite the opposite. Exploiting the sympathies of religious voters may not be the dirtiest possible tactic, but it is an alternative to making a case for election into our secular government based on secular argumentation. It deliberately leads others astray from the political culture expressly insulated from religious doctrine by the Founding Fathers themselves. Simply put, making faith a matter of political consequence could only be pleasing to a deity that was intent on undermining crucial principles on which the United States of America was originally established. Does anyone believe God is truly against us in that way?