Of Anachronism

Some atheists have proposed that it is possible to be good without God. They’ve plastered the slogan on buses, developed websites, and sold t-shirts to press the point home.  In a minor spin of the same message, other atheists are saying that despite what “religious people” (or often simply “religion”) say, you don’t need God to lead a good and meaningful life.  If the meaning of these slogans is that millions of people find moral value and meaning outside the constraints of religious faith, I agree–wholeheartedly–and I think I am one of them.  I challenge anyone to a duel if they say my love of art, music and literature is deficient; and I will shoot first.

At first blush, these seem like eminently reasonable propositions–as unarguable as Dr Seuss’s assertion in Horton Hears a Who that “a person’s a person no matter how small.” It’s the language of the culture of self-esteem.  And it tells us that, despite anything Dostoevsky might have said a hundred (plus) years ago, it’s the absence of God that makes us all equally worthy; the moral universe does not collapse with his non-existence.

On the contrary, the presence of God, or at least a law-giving god like the biblical god, creates a value system and a moral hierarchy that modern women and men find unbearable.  There is no universal human equivalence in this God’s world, only saints and sinners, law and law-breaking.  I reject that system as vigorously as do my atheist friends. There can be nothing like a human moral system–a system good for humans–apart from humanity.  Many atheists believe this–and many religious people, even if they don’t, will eventually have to face up to it.

Unfortunately, atheists at this point often try to press their case by cherry-picking the most obscene passages of the Old Testament and raising questions about the mental capacity of people who (they seem to allege) believe the verses still apply. Should parents be permitted to kill disobedient sons after a cursory inquiry at “the city gates”?  Should fathers be able to sell daughters into slavery?  Is a woman unclean (untouchable) for sixty-six days after the birth of a female child?  Does the definition of rape depend on whether it happens near a city or in the country? Is God so petulant that he needs to destroy a world he could have made better, thus causing his non-omniscient self, not to mention his creatures, endless trouble?

Lapidation: fun for the whole family

The relative ease with which these questions can be tossed aside in disdain should clue the reader in to the fact that he is not reading an engineering textbook, that he is treading on unfamiliar, primitive soil.

If you can read this, do what it says...

The script for these objections changes slightly, but the underlying assumption of an unbelief-ful realist doesn’t: The common notion is that if you point out tirelessly what a silly book the Bible is, people will eventually begin to read it, see the absurdity, and say “Eureka: what an idiot I’ve been.”

I think these Aha! moments actually happen in certain cases, but the great majority of believers really don’t care about the absurdities, and the more “faithful” they are to the traditions of their church, the more they will know that the tribal contexts of Old Testament justice (exception being made for the recent use of lex talionis on bin Laden) don’t form part of the living voice of religious tradition in the twenty-first century–just as they haven’t for almost a millennium.

Maybe, as an axiom, unbelievers should flirt with the idea that things that are regarded as anachronistic or irrelevant by the vast majority of religious people are not the best evidence against theism.  That is why, for example, most philosophy of religion anthologies that include a chapter on “Descriptions and Attributes of God” deal with properties and not irrelevances skimmed from the pages of the Bible.

Anachronism is a putative pitfall in constructing any historical argument.  To see how, don’t think Biblical law and custom–think Hamlet. I remember thinking, the first time I read the play, that all the violence could have been avoided if the young prince had just called the police.  (Never mind that if that had been an option, Shakespeare would not have had a tragedy.)  After all, the evidence was all on Hamlet’s side.  Polonius might have testified. Even Gertrude might have broken down and ratted on Claudius, and Claudius himself was not exactly a bastion of resolve.  Instead, it all ends badly with everyone dead, including Hamlet.  Fortunately I did not offer this solution on my final exam.  It would have been my Paris Hilton moment.

But, no doubt, you’re way ahead of me. Hamlet doesn’t call the police because there weren’t any. Armies, sure, but armies weren’t usually called in to settle domestic spats, not even ones involving murder. Shakespeare wrote the play based (perhaps) on a thirteenth-century work by Saxo Grammaticus–when justice was even more primeval and unavailable than in his own day, and where honor, shame and vengeance were largely governed by the family and by local magistrates (judges)–closer therefore to the Bible than to modern practice.  Ultimately, the stories about heirs, usurpers and murder can be traced all the way back to David and Saul, or to Isaac, Esau and Jacob.

When did “crime” become a police (literally, a city) matter and not something to be dealt with in feudal or family fashion? 1829, when Robert Peel founded the London constabulary–a move opposed by many people in London (and it was, at first, just in London) because the city folk didn’t want a government agency getting between them and justice. Objections persisted north of the border in Scotland and in the Appalachian mountains of Tennessee in the tradition of clan violence. The first “bobbies” were drawn from the lower ranks of society; many were drunks and bullies–uniformed thugs who meted out justice in strange ways.  When in 1833 Constable Robert Culley was stabbed to death while breaking up an unlawful meeting, a jury acquitted the murderers and a newspaper awarded medals to the jurors. Let’s not even talk about Boston and Chicago in the nineteenth century.

Our sense of justice, and of the control of crime, is a peculiarly modern invention. Yet we’re perfectly willing to accept (without knowing much about its evolution) that things were different–once. We don’t give a second thought to the fact that the meaning of justice has developed along with ways of enforcing and distributing it.  And without getting into the politics of a recent international event, we (many of us, anyway) don’t really interrogate the sentence “Justice was done” when clearly what is meant is “Vengeance was exacted.”  The recrudescence of biblical justice in exceptional cases, like poverty, is something we have to expect.

Scales--yes--but the sword is bigger

So I am curious about why the most universally abhorrent and rejected verses in the Bible should become symbolic of the entirety of the biblical world view. Why do we accept gratefully the social evolution of secular justice but deny religion the right to its own conceptual evolution by insisting it must be held accountable for things it produced in the Bronze Age? If evolution is the key to understanding how the world has come to be the way it looks to us, what’s the point in insisting that the religious landscape is unchanging?  I frankly cannot imagine a more tendentious assessment of history than that one.

The fact is, whatever he may or may not have said, you will not find Jesus of Nazareth enjoining the poor to sell their children into slavery to raise some quick cash.  But Hebrew settlers a thousand years before him probably did just that.  You will find him exhorting a rich young man to sell what he has, and give it to the poor, in order to be a worthy disciple. A thousand years before, to the extent that this history is known to us, such advice would have been feckless, almost incomprehensible.  It is similar to my wondering why Hamlet didn’t call the cops on Claudius.

Even the Hebrew Bible shows the slow and deliberate growth of a moral conscience over its millennium-long development: Like any idea that lasts longer than a day, God evolves:

This is what the Lord says: Do what is just and right. Rescue from the hand of his oppressor the one who has been robbed. Do no wrong or violence to the alien, the fatherless or the widow, and do not shed innocent blood in this place. (Jeremiah 22.3)

And let justice roll down like waters, and righteousness like an ever-flowing stream. (Amos 5.24)

You’ve heard it said, An eye for an eye, a tooth for a tooth [Exodus 21.24]. But I say to you not to succumb to evil: but if one strikes you on the right cheek, turn to him also the other. (Matthew 5.39f.)

None of these comments constitutes a moral system; I may not accept or believe them (especially my “obligation” to an enemy) and the Church itself has fallen shamefully short if the advice of Matthew 5.39 is taken at face value as a standard for all Christians.

But simple historical honesty requires us to notice the change, and along with that (note well,  my friends who tout the iron law of evolution in all things progressive) that the advantageous ethic, the one that looks for compassion and generosity rather than vengeance and payback, is the one that survives the predations of history.  Not perfectly, but more adequately.

Frankly, atheists will get nowhere with the message of “good without God” and its accompanying parody of religious ethics and its drone about the pure awfulness of the Bible. They might succeed in persuading themselves of the rectitude of disbelief by creating a litany of biblical absurdities.  But then the core principle of development, which is really at the heart of the atheist worldview, is laid aside in favor of a partial and static view of history that careful investigation won’t support.

The moral is, you can’t call the police when there aren’t any. And you can’t blame the Bible for being a “moral archive” of how human beings have changed their minds over the course of 2500 years.



Imagining Unbelief

My grandmother was a sturdy soul.  Her life consisted of taking care of her demanding German husband, incessantly cleaning a spotless house, speculating about the conjunction of rain clouds and her arthritis, and calling the church rectory for updates on mass times and confession.  She came from a large, loud, tuneful Irish family, pronounced film as “filum” and laughed at jokes three minutes ahead of the punchline.  “Hey Nonnie,” I would say, “Did you hear the one about the priest and the chiropractor?”  The laughing would start ere the words were out of my mouth.

She was patient, gullible, superstitious, carping and kind.  She didn’t like dogs or most of her neighbors, squinted at dust, sermons about Mary, and occasionally at me.  If she had secrets or dark corners to her existence they were buried with her and will remain forever unknown.

She now exists in photographs–often with the image of my grandfather standing in the background with a slight frown–not wishing to be in the picture but unwilling to move entirely out of range.

The photographs are important because when they were taken–mainly in the 1960s–pictures were a bit of trouble: camera models, film, exposures (as in number of), light and focus were part of the vocabulary. No snapping your cellphone at any stationary or moving object that caught your fancy and then uploading images of you and your best friends by the dozen for the delectation of complete strangers.

I have a theory that the less complicated picture-taking and image-making have become, the less sophisticated our memories and imaginations have become–a complaint some social theorists have leveled at “comprehensive” museums and zoos.  Imagination is not stretched.  Memory is not exercised.  Connecting impressionistic dots, sometimes captured years apart, is not required.  We live in the eternal present of the utterly familiar and the easily available Now. History is not needed to explain the familiar.  We know all about it. Thus history is a primary casualty of the widespread feeling that the unfamiliar–especially the past–is alien to the Now.

The tandem growth of religious illiteracy and EZ atheism emerges from the same matrix, one where what is “new” is regarded as good and what is old, or requires time, patience and interpretation, is regarded as irrelevant.  As the cultural gospel of America has always cherished this principle anyway (“A country without history for a people without memory”), the imagination crisis is especially prevalent in the USA.  Nowhere is religious crudity cruder, nowhere does it saturate politics more thoroughly or with greater dull predictability.  Discount atheism, especially of the new and in-your-face variety, is nowhere more disagreeable or less philosophical.

Henry Ford: "History is bunk."

It is enough for the American Catholic to know when the pancake breakfast begins (“after the 9 o’clock”), never mind the aesthetic torpor that his church offers as a sedative for his under-active conscience or the essentials of the faith he never bothered to learn.  It is enough for the liberal Protestant to know that a collection is being taken up for Tsunami victims and for the conservative Christian to live in the cozy knowledge of Jesus’ saving grace–which entails the belief that abortion means killing babies and that Democrats want to demolish churches and put up mosques. It is enough for the atheist to see the deformed opinions of the religious majority as proof positive that he is right: God doesn’t exist and religion is for imbeciles.

The fact is, all four of the above have developed their beliefs through packthink.  Stem cell research does not entail killing babies.  America is not a Christian country.  Believing in God is not the same as belief in elves, fairies, and the Loch Ness Monster.  To be fair, the Catholic did not arrive at her position by reading Aquinas or the Protestant by reading Jonathan Edwards or the Muslim fanatic by reading Ibn Rushd or the atheist by reading Julian Huxley (an atheist supporter of Teilhard de Chardin, a Jesuit who had read Aquinas).  They got there by reading pamphlets and the back end of cars.

Julian Huxley

What each group seems to be happy with is the discounted version of the “faith” they have chosen to embrace.  Coming “out” atheist, a mildly cool social stance similar to coming out gay in the nineties, requires the same level of intellectual commitment as coming out Christian, a mildly cool stance of the 1970’s when unseen forces (in Washington) convinced the believing masses they were in for a new persecution by neo-pagans, secular humanists and freedom-hating liberals.

Our presentism, symbolized in the free flow of limitless images and text messages, no longer needs ideas to survive.  That is why bumper stickers have replaced chapters in books as the all-you-need-to-know summation of belief and unbelief.  “My Boss is A Jewish Carpenter,” “I Support a Baby’s Right to Choose,” “‘Worship Me or I will Torture You Forever’–God,” “Organized Religion: The World’s Biggest Pyramid Scheme.” The hostility among groups and even within groups is not about ideas but about what one side is prepared to believe about the other: fakery not fact, histories robbed of historical location and philosophical positions devoid of premises and analysis.  It is a contest for followers lifted out of the Forum and plonked down into the Colosseum–where both sides will eventually lose.

Which brings me back to the lessons we can learn from photographs.  It isn’t the case that religion has not evolved.  But it is the case that religion has been, in evolutionary terms, unsuccessful in explaining itself to the twenty-first century–and to much of the twentieth. The increasing drowsiness of the flock when it comes to core doctrines may be a blessing for beleaguered theologians who otherwise would have to go on defending what the faithful have ceased to care about.  “Average” believers have defaulted to ground where they are more comfortable–to social issues and sexual ethics, buoyed by a thin belief in scriptural authority and a woeful lack of information about the warrants and religious justification for their commitments.  As religion can only thrive when its explanatory mechanisms are coping with change, its explanatory failure will ultimately prove to be catastrophic, and no new theological idiom will arise to save it.  In my opinion, this has already happened, and not only in liberal and radical circles.

This should serve to make atheism triumphant, but it doesn’t.  If theology has lost its voice and credibility, atheism has lost its imagination and coherence. It has done this by offering, instead of a vision of the godless future, the absurdities and atrocities of religion as the sum total of its own rectitude.  There is nothing wrong with itemizing the failures and hypocrisies of religion; but it does get repetitious after a while, and then the question becomes the Alfie question: What’s it all about?

And there is this detail: The errors-of-religion motif does not originate with atheists but with religion.  It goes back to the reform movements of the late Middle Ages, and to the Reformation itself, unique among the chapters of western civilization in its brutal treatment of popes, doctrines and sacraments.

Reformation cartoon of the Pope as Antichrist

Religion has traditionally been the best guarantor of reform within religion, controlling the excesses and extremities of the religious appetite for a thousand years.  It did this–and succeeded in keeping the beast from devouring its own tail–by offering better ideas, different “truths,” a simplified diet and an accommodating attitude towards movements that would finally grow up, leave home, and not write back–secularism and humanism to name two.  What it never did, or was never prepared to do, was to offer no religion in lieu of bad religion.  It has survived into an era where many opponents have joined the chorus that all religion is bad religion.

Yet for atheists to assume that their rejection of God is anything more than an opinion based on snapshots of what they know about Catholics, Jews, Muslims and Protestants is a misshapen view of their accomplishment.

The aggregate outrages of religion do not constitute a proof of God’s non-existence, nor establish a moral case for atheism.  The accumulation and “sharing” of snapshots of things that are plainly ridiculous about religion does not enhance the claim that unbelievers are smarter than believers.  The documentation of error is not the same as the discovery of the truth.  Ridiculing the beliefs of our distant faith-obsessed ancestors or the profanity of violence that seems to soak the pages of the Hebrew Bible, and more recently the Qur’an, belongs to other centuries: it’s been done.  It’s good for a laugh, or a gasp, not for a lesson.

And a final thing. If the contemporary atheist is really interested in the harmful effects of religion, he is up against two truisms that run counter to evolutionary wisdom: the adaptability and survival of religion, despite texts and practices assumed to be harmful to human society, and the fact that atheism has so far struggled unsuccessfully to replace religion with a new diagram of human values.  Unlike Alvin Plantinga, I don’t regard these phenomena as real facts, as “evidential” of the truth of religion, or as reliable justifications of religion based on common sense.  This is because I haven’t the foggiest idea of what it means for religion to be “true” in the sense analytic philosophy comprehends the term.

But I do have an idea of what values religion expresses idiomatically and crudely in ways that have occasionally challenged the human imagination.  If religion has a survivability quotient that can be expressed in evolutionary terms, it is a human quotient.  In their independent ways, the atheist Julian Huxley and the believer Teilhard got that much right.

Blessed are the peacemakers...

I personally believe that the survival of religion can be explained in purely rational ways, and with no guarantee of lifespan.  I also happen to believe that atheism, if it is an informed and historically critical atheism–aware of its own past as well as of the religious past from which it artificially emerges–can develop new templates for human value that test the imagination in the same way that the interpretation of images and artefacts from the human past tests, and is resolved in, the imagination through religion.

The elevation of atheism from opinion to something of much greater consequence begins when we see that belief and unbelief are aspects of the same reality.   Looked at in the starkest light, belief is only the other side of unbelief.  It is not a distinction that has the valence of right and wrong. It is pretty clear which came first, what images became dominant, which ones were lost in wars, through subjugation, and by assimilation.  Just like your family album when images were scarce, real and not easily improvable, the total picture of religion that the atheist is called upon to interpret is complex and requires a thoughtful charting of the distance between the rarefied image and the inquirer, a conversation between past and present which is more than an indictment of crimes.  It requires, as Gauguin said about imagination, “shutting your eyes in order to see.”

Attribute and Affect

I have argued against theologians like Richard Swinburne that they play a dangerous game in moving from abstracted notions of God to specific characteristics of God and the doctrines of Christianity. In the long run, the snowman they build feature by feature is still snow. It will melt. Both believing and unbelieving philosophers of religion have played this game for a very long time–perhaps since the time of Aquinas–but the bottom line is: No one is an atheist on general principles. There is some X that you reject, and that X comes with attributes or “properties” attached. Any working notion of ontology requires not merely existence but attribution.

This is why the most damaging arguments against ontology, going back to the eighteenth century, begin with the criticism that “existence” is a state (being) and not a property. Anselm had argued against his hypothetical unbeliever that God is “that than which nothing greater can be conceived [to exist],” and then took the leap to existence by stating that the existence of the greatest conceivable thing cannot be merely conceptual since perfection requires actuality. Anselm limited this state of perfection to being and not to racehorses or desert islands because ordinary things can be conceived in degrees but not in states of perfection. God thus becomes a supreme case of perfection existing in actuality because it cannot simply exist in the mind, for “Si enim vel in solo intellectu est, potest cogitari esse et in re, quod maius est” (Proslogion 2). Now that you have the snow, it is possible to add goodness (Aquinas’s summum bonum), and the so-called Omni-properties of God (knowledge, presence, benevolence, etc.) as well as the Not-properties of God: infinite, immutable, impassible, etc. Snowman, meet your maker.
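
For readers who want the bones of the argument laid out, a conventional modern schematization of Proslogion 2 runs roughly as follows; this is my paraphrase, not Anselm’s own wording, and the symbol g is simply shorthand for his famous formula:

$$
\begin{aligned}
&(1)\ g := \text{that than which nothing greater can be conceived}\\
&(2)\ g \text{ exists in the understanding, since even the fool grasps the phrase}\\
&(3)\ \text{suppose } g \text{ exists in the understanding alone}\\
&(4)\ \text{whatever exists in reality as well as in the understanding is greater than what exists in the understanding alone}\\
&(5)\ \text{then something greater than } g \text{ could be conceived, contradicting } (1)\\
&(6)\ \text{therefore } g \text{ exists in reality as well as in the understanding}
\end{aligned}
$$

Kant’s complaint, taken up below, is that step (4) smuggles existence in as if it were one more great-making property.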

It is perfectly possible to believe in snow without believing in snowmen. But in historical theology we have long come to accept that the God of the western tradition, and by and large the God rejected by the first brave souls of the pre-Enlightenment, like John Biddle (1615–1662), is more slush than shape–to wit, Biddle on trying to make sense of the Trinity:

“The major premise is quite clear inasmuch as if we say that the Holy Spirit is God and yet distinguished from God then it implies a contradiction. The minor premise that the Holy Spirit is distinguished from God if it is taken personally and not essentially is against all reason: First, it is impossible for any man to distinguish the Person from the Essence of God, and not to frame two Beings or Things in his mind. Consequently, he will be forced to the conclusion that there are two Gods. Secondly, if the Person be distinguished from the Essence of God, the Person would be some Independent Thing. Therefore it would either be finite or infinite. If finite then God would be a finite thing since according to the Church everything in God is God Himself. So the conclusion is absurd. If infinite then there will be two infinites in God, and consequently the two Gods which is more absurd than the former argument. Thirdly, to speak of God taken impersonally is ridiculous, as it is admitted by everyone that God is the Name of a Person, who with absolute sovereignty rules over all. None but a person can rule over others therefore to take Him otherwise than personally is to take Him otherwise than He is.”

Granted, the early atheist thinkers were less concerned with the Big Picture than with dismantling inherited beliefs member by member. Many had long since concluded that the wheels of theology spun around doctrines rather than biblical texts, which had been gratuitously laid on or cherry-picked to support beliefs that otherwise had been fashioned by councils without any scriptural warrants at all. A classic case, as it relates to Biddle’s long-winded dilemma above, was the so-called Johannine Comma: a sequence of extra words which appear in 1 John 5:7-8 in some early printed editions of the Greek New Testament:

ὅτι τρεῖς εἰσιν οἱ μαρτυροῦντες [ἐν τῷ οὐρανῷ, ὁ Πατήρ, ὁ Λόγος, καὶ τὸ Ἅγιον Πνεῦμα· καὶ οὗτοι οἱ τρεῖς ἔν εἰσι. 8 καὶ τρεῖς εἰσιν οἱ μαρτυροῦντες ἐν τῇ γῇ] τὸ πνεῦμα καὶ τὸ ὕδωρ καὶ τὸ αἷμα, καὶ οἱ τρεῖς εἰς τὸ ἕν εἰσιν

and which were included by the King James translators, thus:

“For there are three that bear record [in heaven, the Father, the Word, and the Holy Ghost: and these three are one. 8 And there are three that bear witness in earth], the Spirit, and the water, and the blood: and these three agree in one…”

Is this not the Trinity, beloved of both Catholics and Protestants since the fourth century? Well, no, because the bracketed words are absent from the early Greek manuscripts, and only appear in the text of four late medieval manuscripts, where they seem to be the helpful clarification of a zealous copyist, originating as his marginal note. Think of it as new snow.

The point of these examples is that modern unbelief is highly confused about the difference between snow and snowmen, between being and somethingness. Simply put, what does it mean to say “I don’t believe in God,” if (as many atheists have reminded me) that is all an atheist is required to say to be a member of the club? My query is really the same as Robert Frost’s poetical question in “Mending Wall”: “Before I built a wall I’d ask to know/ What I was walling in or walling out,/ And to whom I was like to give offense.”

I maintain that it is impossible to accept Anselm’s ontological argument. Kant was right: “Existence is not a predicate.” The ontological argument illicitly treats existence as a property that things can either possess or lack: to say that a thing exists is not to “attribute” existence to that thing, but to say that the concept of that thing is exemplified–expressed and experienced–in the world. Exemplification requires attributes. That is why the obscure language and syllogisms of philosophy (for the above, e.g.: “S is p” is true iff there is something in the world that is S, satisfying the description “is p”) have never really appealed to robust varsity atheists. But Kant’s critique of ontology slices both ways: if ontology is defeasible because existence is not a predicate, then the statement “God exists” is not falsifiable because there is nothing in the world corresponding to God, at least not of the S is p variety.
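
To see why the slogan cuts the way it does, it helps to spell out the quantifier paraphrase that post-Kantian logicians (Frege, Russell, Quine) made standard. The rendering below is that textbook paraphrase, not a formula supplied by the essay’s sources; Sx and Px simply abbreviate “x is an S” and “x is p”:

$$
\text{“S is p” is true} \iff \exists x\,(Sx \land Px), \qquad \text{“S exists” is true} \iff \exists x\,Sx .
$$

On this reading existence shows up only as the quantifier binding the formula, never as one predicate among the others; that, roughly, is the cash value of “existence is not a predicate.”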

Kant

Many atheists know this, and they also know that their rejection of “theism” (a very funny word derived from the Greek θεός, a god; hence a-theism, being without such a belief) is not based on snowflakes but on fully formed snowmen: the God of Christian (or Jewish, or Islamic) “theism” who comes to us in a manifestly literary, messy, and inconsistent way in scripture. You cannot be an atheist in the abstract; you have to be an atheist in terms of attributions that have been applied in specific historical moments and which can be traced to particular historical contexts–such as the legislative “creation” of the Trinity in 325 AD. You must be walling out something.

I am perfectly at home with this kind of unbelief, comfortable with the truism that most people are atheists with respect to 99% of all the gods who were ever believed to exist. The statement is inadvertently poignant because it suggests that what we find easy to contradict or reject are specific “attributes” or characterizations, from which we then construct a more complete rejection of the whole picture. Every clever schoolkid knows the game and the logic: How can a God who is all-good tolerate famine, cancer, premature death? How can a God who is all-wise put the prostate near the male urinary tract (was he cutting costs)? Why would a God who is all-powerful not create us, like Adam, in a post-adolescent, decision-making state free from high school, acne, and nagging parents? Note that what is being rejected are the attributes laid on this God, attributes which are construed from “S”: the state of existence as we know it.

Conveniently for unbelievers, the rejection of attributes is facilitated by books thought to reveal the nature and purposes of God himself, especially the Bible and the Qur’an. The existence of texts that were never designed for use in philosophical and theological argument is a treasure chest for unbelievers–full of informal literary proofs that the God made from scriptural snow doesn’t correspond to the God made from theological snow: His whole story is an epic tragedy that could have been avoided if he had but exercised his omniscience and power at the beginning of time, avoided making fruit trees, or refrained from making Adam, or simply said “Apology accepted” when the First Couple betrayed his sole commandment. The manifest insufficiency and limitedness of this literary deity measured by the philosophical yardstick brought into the Church with theology–moments of remorse (Genesis 6.6) and petulance (6.1-16) and violence–flood, war, disease, death–makes the job of the skeptic a walk in the Garden.

What the unbeliever discovers in an amateur way is the composite nature of tradition: God-traditions that developed in Jerusalem and Athens being spliced together with sometimes implausible ingenuity and impossible contiguity. The illegitimate move is for the skeptic to conclude that the process of development is in some sense a “system” of untruths devised by ignorant or malicious men to keep the facts hidden or science suppressed. The real story, like all real stories, is much more complicated. But science does not emerge from the total exposure of the God traditions as deliberately false–the wreckage of a false system on the shoals of fact. It arises because of the inadequacy of the explanatory power of religion: the appearance of nature beneath the melting snow, to cop a phrase from Emerson.

End of winter

I think it is important, if only at an educational level, for unbelievers to avoid the errors to which their commitment easily gives rise. One is a version of what W.K. Wimsatt called in 1954, in connection with literary criticism, the “affective” fallacy. He used the expression to mean that the ultimate value of a piece of literature (or art) cannot be established on the basis of how it affects a reader or viewer:


“The Affective Fallacy is a confusion between the poem and its results (what it is and what it does), a special case of epistemological skepticism [ . . . which . . .] begins by trying to derive the standard of criticism from the psychological effects of the poem and ends in impressionism and relativism [with the result that] the poem itself, as an object of specifically critical judgment, tends to disappear.”

Applied to the God traditions, the fallacy works like this: atheists are fairly quick to judge religion solely on the basis of its (presumed) effect on believers, such that the details of the question of God’s existence and the implications of belief for everyday life disappear. We can see this tendency especially in the writings of atheists who cherry-pick the toxic texts of scripture to conclude that believers who accept such stories as true are delusional or dysfunctional. I remember listening passively at an Easter Vigil celebration many years ago as the following, called the “Song of Moses” from Exodus 15, was read out:

Then Moses and the Israelites sang this song to the Lord:

‘I will sing to the Lord, for he has triumphed gloriously;
horse and rider he has thrown into the sea.
2 The Lord is my strength and my might,
and he has become my salvation;
this is my God, and I will praise him,
my father’s God, and I will exalt him.
3 The Lord is a warrior;
the Lord is his name.

4 ‘Pharaoh’s chariots and his army he cast into the sea;
his picked officers were sunk in the Red Sea.
5 The floods covered them;
they went down into the depths like a stone.
6 Your right hand, O Lord, glorious in power—
your right hand, O Lord, shattered the enemy.
7 In the greatness of your majesty you overthrew your adversaries;
you sent out your fury, it consumed them like stubble.
8 At the blast of your nostrils the waters piled up,
the floods stood up in a heap;
the deeps congealed in the heart of the sea.
9 The enemy said, “I will pursue, I will overtake,
I will divide the spoil, my desire shall have its fill of them.
I will draw my sword, my hand shall destroy them.”
10 You blew with your wind, the sea covered them;
they sank like lead in the mighty waters.

11 ‘Who is like you, O Lord, among the gods?
Who is like you, majestic in holiness,
awesome in splendor, doing wonders?
12 You stretched out your right hand,
the earth swallowed them.

Invested with the spirit of Monty Python, I struggled not to laugh: God is great–just look how many Egyptians he killed, how many wives would now be husbandless, how many daughters fatherless. The vast majority of worshipers around me listened inattentively. Some slept. It was the drone of words. The same liturgy would have been performed in 1278. But no one would have heard very much because it would have been executed only in Latin.

To be affected by such passages (even if the effect is indifference) is a function of human perception. To conclude that the people who endure such banality in the name of religion need to be rescued from their belief in the God who seems to like to drown people or reduce their sinful cities to ashes is the affective fallacy. For every smitten, leprous evildoer and every reference to Israel behaving like a whore, there are passages of immense beauty, human pathos, literary quality and even historical importance.

To deny this human quality is to make the text disappear in the interest of sticking to a narrow and unformed reaction to it, normally based on a lack of familiarity with Hebrew (or Hellenistic) literary tradition, storytelling, and historical context. Ironically, it is precisely this same lack of familiarity that permits a fundamentalist to accept “the Bible” in its undifferentiated and inspired totality as the word of God–whose imperfections can be overlooked as part of a divine plan that the book does not reveal in its entirety (1 Corinthians 13.12).

A healthy skepticism is always preferable to uninformed credulity. But I maintain that unbelievers are often terribly credulous when it comes to their view of the positions they have taken. The fact that biblical passages can be shocking to modern sensibilities has no bearing on their “truth” at a literary, cultural, or experiential level. Nor can their value be determined by taking an average of nice texts and nasty texts without exploring individual judgments and categories. “Everything,” Jacques Barzun once told a resolute graduate student who had made up her mind about what a poem really meant, “is a seminar.” Without the seminar, we turn impressions into conclusions, and that is where the affective fallacy leaves us.

Barzun

To say that one does not believe in the God whose attributes are those (more or less, and with no consistency) described in the Bible puts the unbeliever in the company of hundreds of thousands of believers. To say that one does not accept the God of theology, with or without the reconcilable attributes of literary biblical tradition, probably would not greatly reduce that company.

The remaining issue, as John Wisdom once put it, is whether believing in a God without attributes is possible at all, or no different from not believing in God.

Atheist Denominationalism

Most atheists have never read H. Richard Niebuhr. That’s too bad. Because now that unbelievers are fighting with each other about how much of God not to believe in, they have a lot to learn from the battles fought among God’s people for primacy of position.

Niebuhr was primarily an ethicist, and while influenced by philosophers and theologians as far apart as Barth, Troeltsch and Tillich, he was solidly grounded in the reality of social change. He knew that since the Protestant Reformation Christianity had become restless and incoherent. When monolithic belief in God’s holy church and her sacraments was demolished by the phenomenon of “fissiparation” (churches quarreling over picayune differences about inconspicuous doctrines and forming into ever more minor sects), the stage was set for a religion that could hardly claim to be what Christ had in mind when he expressed the wish that ‘all may be one’. Not of course that Jesus was speaking, if he was speaking, of the church when he said that.

Countries around the world experienced the Protestant Reformation in different ways: Europe at a theological level, and then in skirmishes that grew into full-fledged wars. No longer able to contain the confusion by executing the odd heretic or sending the forces over the hill to rout the Huguenots, Europe settled at last into a state of religious détente that grew eventually into boredom and finally into a comparative loss of interest in religion and an acceptance of secular values. One of the reasons the “priest abuse scandal” has been so shocking to Europeans is that this generation of Irish, Italians, Germans and French have a hard enough time remembering the autocratic church of their grandparents’ day, when papal and episcopal fiat were good enough for relatively docile laity. It is the idea that society–the secular–stands against and above the church in all legal and judicial respects that makes the crisis almost unfathomable in modern terms.

A Protestant Scene

America experienced the Reformation as an import–it was a receiver nation. Whatever you might have learned about America being solidly “Christian” at its foundation is not only not true, but not true because the seventeenth century was the era when Christianity itself was being redefined. The Puritans of New England did not share the religious interests of the commercial men of Massachusetts Bay, a “factorie,” and the relatively softer Baptists followed on their heels within a generation. Harvard had fallen from Calvinist grace by 1701, when Yale was founded to preserve its true religion (the mottoes are revealing: Harvard, Veritas; Yale, Lux et Veritas). By then, Jews were aboard, or off ship, in Rhode Island and the first waves of Catholics had already arrived in Lord Baltimore’s Maryland. Go a bit further south and boatloads of low church Anglicans had disembarked in Virginia decades before, and Presbyterians would squeeze into the gaps in the Carolinas, named for Charles II. Georgia (the name first suggested to a delighted George II in 1724) would be transformed by Wesley’s followers into a colony for Methodism. Go a little deeper and change colonial masters: waves of Catholics driven from New France by the pursuing forces of the British General Wolfe would arrive in the bayous of the Mississippi Gulf region and learn to call it home.

A 'Cajun' (Acadian) Scene, 1898

The grab bag of religious immigrants that came together at the end of the eighteenth century was not an especially remarkable mix. It was a powder keg of competing denominations with explosive potential. In their wisdom, as Americans like to say of the founding fathers, the authors of the Constitution were savvy enough to make sure that religion and government should stay apart: that’s what the First Amendment was devised to do. But they were equally savvy about the instincts of these displaced and largely yokel Europeans. Whether it was debt, famine, crime, adventurism, a loveless marriage, a lost fortune or religious persecution that had brought them to the New World, it was entirely likely that their faith came with them. So, what the founders gave with their right hand to government they took away again with their left by delivering to these competing sects the “free exercise” of their faith. Congress would never pass a law constraining the free exercise of religion. And in saying that, they passed a law concerning the free exercise of religion. America became the most religious nation on earth and the most fertile field for growing new religions.

Mormon Trek

Niebuhr of course knew all of this, the son of a distinguished immigrant German theological family himself. He knew that the ragbag culture of American religion would always be a supermarket of choices–and not only that. There was something in the nature of Protestantism that was friendly to competition (as Weber had argued at the beginning of the twentieth century), from strong belief to weak belief, from Born Againism to Ask me Later. If ever Feuerbach needed confirmation of his idea that religion makes God in man’s image, the proof could be found in the American Experience.

Acadia

* * *

Modern Atheism is a continuation of the pattern of denominationalism and derives specifically from it. It is the fatal last step in the journey from strong to weak belief. Just as secularism emanates from the religious acceptance of tolerance and pluralism, necessities imposed by competing sects living in close cultural proximity over a long period of time, atheism is that point on the belief scale where God becomes not optional but impossible.

By saying that, I don’t mean to suggest that atheism is a religion. That is a limp, tiresome, historically uninformed debate. But atheists would be very foolish not to understand themselves as connected through history and process to the developments that help us to understand the phenomenon of denominationalism. If a hard-core atheist cannot believe in creation ex nihilo, it would be pretty silly for him to believe that any social or intellectual position can be equivalently wrought.

It also seems clear to me that atheists, in accepting that they have their origins not in Zeus’s bonnet but in a social process, should also accept that atheism will experience its own denominationalism, its own sectarian divisions. This process has been under way for a long time. We are seeing its latest eruption in the “debate” between old and new atheists, as in the twentieth century in the differences between religious humanists and secular humanists. Even the terminology used to express the differences (as Niebuhr pointed out) becomes crucially significant: labeling is one of the properties of the Protestant spirit. Just as it isn’t enough to say Baptist without specifying Southern, American, Freewill, Particular or Seventh Day, the day may come when one atheist will demand of the atheist sitting next to him at a bar: “Old,” “New,” “Bright,” “Strict,” “Friendly” or “Prickly?”

What denominationalism teaches is that human beings despise norms. I suspect there will never be a more impressive “norm” than the rules and doctrines and liturgies of the Catholic church of the sixteenth century. If you want to know what God looked like at the peak of his game, it was then. But “then” is when the Reformation happened.

I suspect that atheism had something like that heyday in the 1940s, when it became, normatively speaking, a sexy bad-boy philosophy associated with the likes of Julian Huxley and Bertrand Russell, and slightly later with (certain) existentialists, especially Camus. It had a prior history of course, and a later one. But I tend to think the potential for variegation in atheism goes back far into history. Maybe it goes back to Hobbes, maybe to Lucretius or Epicurus. But wherever it goes it is always in juxtaposition with religious values, and often enough (especially with the French) with particular religious doctrines. Read the foreword to Marx’s doctoral thesis to see what I mean.

Atheism follows the religious pattern of denominationalism not only because it behaves religiously but because its central question is a religious question—or more precisely a question about religion. It should surprise no one that what we are seeing now are permissive, soft, hard, pluralistic, total rejectionist, possibilist, impossibilist, and accommodationist responses to the question of God’s existence and the “meaning” of religious experience. Why would we expect anything else?

What we can hope is that the process doesn’t take atheists too far down the denominational road as they jockey for position as the True Unreligion: Once-born and twice-born atheist is a distinction we can live without.