The Sure-Fire Atheist Rapid Response Manual

Posted on December 1, 2011

When I wrote Atheism’s Little Idea I said atheists were small. But (and this is embarrassing to confess) I had no idea how clever they were.

There’s a species of ant in Papua New Guinea that is so small you need a magnifying glass to see that they’re insects and not swirling grains of sand.  But drop a crumb of cheese on the ground and an army of ten zillion will appear out of nowhere, through the floor cracks where they live invisibly, and devour the cheese before you can retrieve and pop it subtly into your mouth.  They are truly amazing little creatures.  I think they are called siboyeki.  I’m not sure there is a singular, and there doesn’t need to be, because they are never alone.  They don’t believe in God either.  I assume they have short memories because they don’t seem to mind eating the same sort of cheese crumbs day after day.

When the atheists had grown tired of my “endless harangues against atheism” last year, they swarmed at me, Jacques Berlinerblau, and Michael Ruse all at once. We had said, in different tones, that they were playing too rough, were turning people off (including fellow unbelievers) with their flatfooted tactics, and needed to behave like adults with real arguments and day jobs.

The atheist swarm may actually have eaten the other two because I haven’t heard from them in a long time.

But it was then I learned their strange language and breeding habits:  Like all small things, their safety is in numbers. One atheist alone is hardly a match for his (or her) natural enemies, the Christian Nation, the low-wattage Dims and flabby franks like me who send mixed signals about what they really believe. But one thousand atheists on a single mission can take down a faitheist, an accommodationist and an Associate Reformed Presbyterian pre-Millennialist going through a divorce in about a minute. I’ll tell you this: if Osama bin Laden had ranted about atheists and not “the West” (where is that exactly?) he would have been cheese crumbs in October 2001.

Of course the real advantage atheists  have over ants is language.  Siboyeki can’t talk, but atheists can and some of the older ones can read, as one of them amply demonstrated to me recently by quoting a poem.  This greatly unsettled me, because until this event I’d thought that Rachel Rubinowitz, my girlfriend in college, and I were the only people in the world who had ever read Philip Larkin.  She said it inspired her to become a librarian.

I have come to be a huge admirer of how the atheists organize for their own protection and what they are able to accomplish on a low budget.  I have wondered how this is possible ever since I was almost eaten last spring. But now I know.

Almost miraculously, about a month ago,  I happened on a used copy of the Sure-Fire Atheist Rapid Response and Defense Manual and Cookbook at You’re Mama’s [sic] Bookstore in Sausalito, complete with marginal notes and exclamation  points.  It was selling for only a dollar and the owner said it had been brought in by a distraught undergraduate only days before.  The student had said he didn’t have time for the meetings anymore and had decided to become a biblical scholar.

The manual is pretty short–less than a hundred pages of double-sided photocopy paper–and has only one illustration, which is a little murky. It seems to show an atheist eating a baby between hamburger buns while a little old lady with white hair tied in a not-unseductive knot says to her husband, “I told you so.”

There are too many treasures in the Manual to describe them all, so here are my favorites:

Part One

(1)  The Atheist Pledge.  “We believe there is No God. That’s it. Full stop. End of Discussion. Move on.”

(2)  The enemy does not believe this.  He will say things like, “But…” or “Have you considered…” or “I don’t know.”  This kind of talk must be discouraged because it overthinks our position.  Overthinking is dangerous. Good men and/or women have been lost because of overthinking.

After a few good recipes in Part Two (I intend to try the Skeptical Chicken tonight), the Manual moves to specifics. Naturally, I was curious about what it might say about people–I mean overthinkers–like me. I was delighted to find that complete training is provided in how to manage just the sort of situation I’d confronted them with in Atheism’s Little Idea.

(1)  Never mind what the enemy has written: it has no merit.

(2)  Never say it has no merit; say it has no point.  Better yet, say it has no argument.  Argument is a word that implies logical development.  Say it has no logical development.  Other words to use: baseless, yawn, load of crap.

(3)  If the enemy uses quotations or historical reference points, ask him where he got them from.  Be careful, because he might have got them from somewhere.  If he responds, say “Like you know everything, right?” Or, “Who agrees with you?”  Remember: the point is to fluster, disorient and win.

(4)  Multipurpose Global Utility (Straw-Man) Argument: If you think he and/or she does have an argument but you can’t quite understand it, go to page 33: “How to Use a Straw Man.”  The Straw Man defense is a sure-fire destroy-all toxin that will paralyze the enemy. Basically, it is the same as a six-year-old saying “You made it all up,” but sounds much better.  Plus, you don’t have to explain anything about where he uses it.

(5)  If you don’t understand the Straw Man Defense, resort immediately to one of the following:

(a)  Call the enemy arrogant.  Our enemies are all arrogant or they wouldn’t be blogging against us, so this is bound to work.  Words like “pompous,” “misguided,” “pathetically out of touch,” “incredibly uninformed” and similar expressions will work just as well.  Try to avoid “full of shit,” and if you use the word “erroneous,” remember there are two r’s.  (See also spelling tips under accommodation/accomodation/akomodation.)

(b)  Call the enemy ignorant. This is basic, because anyone who disagrees with atheism is ignorant.  You can also use some of the same words: incredibly ignorant, unbelievably ignorant, I-don’t-know-how-you’re-able-to-tie-your-shoes ignorant.

(c)  Call the enemy unimpressive.  Make any really important points look like insignificant points.  There are various ways to do this:  “lol, r u serious”; “omg, I can’t believe how ridiculous when you said…”; “When’s your next appointment with your psychiatrist?”; or “U BLEEV IN CEILIN CAT DOAN U?”

(d)  Call the enemy boring.  “I tried to read this but I fell asleep at the first comma.”  Better yet, “I would have fallen asleep at the first comma but I had to stop for a pee at the verb.”

(e)  Say that the enemy hates science and reason.  See p. 98: “What are Science and Reason?”

(f)  Say that the enemy is confused.  “I don’t know where to begin discussing this cartload of doodie, and unless I knew in advance there was gold buried in it I don’t think I’d even want to try.”  This one always works, and you can use other words besides doodie.

(6)  If you find that a website is “moderated,” say that it violates the fundamental right of Free Speech guaranteed to atheism in the Constitution. It is what our atheist forefathers like Thomas Jefferson and Benjamin Franklin and probably also Abraham Lincoln and Shakespeare fought and died for, or would have died for if they’d had to. Not publishing our comments is a form of defamation (p. 67) and discrimination, which is forbidden by all relevant anti-discrimination laws: see p. 80, “All Relevant Anti-Discrimination Laws.”

Accuse the enemy of running his blog site like a Gulag.  Badger the enemy repeatedly, saying “Where is my comment?”, “You don’t believe in free speech, do you, you arrogant stuffed pork pie,” or “No wonder your pitiful little site gets so few hits; you run it like North Korea.”  If he shoots back some irrelevant comment about all your examples coming from atheist states like North Korea, say “That’s just what a pompous, boring, pork pie would say.”

(7)  The art of the quibble:  Throw the enemy off balance. Coordination is everything.  Nothing is too small for a quibble (rhymes with nibble), and almost anything counts as a quibble: asking for page numbers, correcting grammar, and wondering if the enemy is jealous of Richard Dawkins’s unparalleled success as an atheist writer are good starts. But if he gets scrappy, move on to statistics, as in “You say that atheism is in decline; I’d like to know how you know this,” or “You say that religion is responsible for the preservation of learning traditions and the rise of universities; can you give me an example?”  Be ready to say “I didn’t think so,” or “Gotcha,” while he’s thinking.

Atheist dramatist William Shakespeare

(8)  Limited Purpose Fake Fallacy Multipurpose Argument: The Courtier’s Reply: Remember that since God does not exist, anybody can be an expert on God, which is like being an expert on nothing.  Scientists can be more expert on God because a majority of scientists don’t believe in God. The more you know about God the less you know since God does not exist: don’t be afraid to say this.  If someone believes in god, or knows someone who does, that gives them no right to say you don’t know what you’re talking about just because you don’t.  In fact, just the opposite.   The enemy will pretend to know history and all kinds of other irrelevant stuff.  Remember: history is for losers, but winners write history.

The flipside is that context is nothing.  We are factualists and factuals are factuals.  The enemy might say, “Atheism really didn’t crop up in a big way before the eighteenth century.” This is not true, as our Big List of Famous Atheists proves.   String together some quotes from our Big List of Atheist quotations (mostly made up or paraphrased, but who cares) to embarrass him.  As Socrates once said, “Winning is not about research.”

Also remember, this is a cause: Since atheists are a minority, it follows that we are persecuted, because all minorities are persecuted. The enemy might say something arrogant such as, “Unlike religious dissent and heresy, atheism has largely been beyond the purview (pûr′vyū, n.: scope) and concern of religion, and atheists as atheists have seldom been persecuted by the religious establishment.”

Start with “This is a straw man.”  Then say, “How do you know this?”, and then, “I can’t find page 76 in the book you quoted,” or “This is incredibly ignorant.” If the enemy becomes argumentative, say he is using his knowledge to discredit you and accuse him of using the Courtier’s Reply and zing him with the information that the Bible isn’t true or say something in lolcat. It never fails because we have the factuals on our side.

a Courtier getting a reply ready

See also p. 67, “Tricks the Enemy Might Use to Confuse Us,” and p. 43, “A Short Guide to History for Atheists.”

P. 43: A Short Reference Guide to History for Atheists

1.  0: God doesn’t create the world, hello?

2.  Big Bang, maybe 13.7 billion years ago. Awesome.

3.  Later: We climb out of the slime.

4.  Early on: Greece and Rome.  Socrates killed for being an atheist.

5.  Later: Christianity invents Jesus.  Bible finished.

6.  Not much later: The Dark Ages.  Nothing of any importance happens.

7.  After that: The Renaissance: a bunch of painters paint god and Mary and Jesus. Things r rly gettin borin.

8.  Not so long ago: The Enlightenment. Gud mornin.

9.  Then: Darwin, beginning of the messianic age. Evrythin iz kewl nao.

10.  Finally: Everything from Darwin to Us.

Atheism’s Little Idea

Dear God: Please come back. We are very sad. Your divinity is beyond doubt. Yours, Faust.

I do apologize.  It seems that everything I write these days is anti-atheist. And who can blame my unbelieving brethren for assuming I am fighting for the other side?  Perhaps I should be, since modern atheism is hardly worth defending.

To be brutal, I cannot imagine a time in the history of unbelief when atheism has appeared more hamfisted, puling, ignorant or unappealing.

Is this because its savants are also described by those adjectives, or because their fans are just being fans, merchandising the cause: t-shirts, coffee mugs, quick fixes, blasphemy competitions, and billboard campaigns? (Axial tilt is the reason for the season: Honest Jethro, I thought I’d never stop laughing.) I mean, who are we unless someone is offended by who we are?  What good is blasphemy if no one is getting their knickers in a knot anymore, for Christ’s sake?  How can we “come out” when there’s no one standing outside the closet to yell “Surprise!” at? And, by the way, you churchy jerks: we are victims.

Atheism has become a very little idea, an idea that has to be shouted to seem important.  And that is a shame, because God was a big idea, and the rejection of the existence of God was also a big idea, once upon a time.

There was nothing “mistaken” about belief in God, and the fact that there is probably no god does not lessen his significance.  No distant galaxy of more intelligent beings has sent us an error message about the God thing. God is no more “wrong” than a carriage is wrong in relation to a Jaguar XKR-S.  Expensive strokes for modern folks, but as carriage is to sleek design and comfortable travel, so god is to modern understanding.  Notice: I did not say science.  I said modern understanding, because only a portion of modern understanding is shaped by science and god is not an object of scientific thought.  If the question of God could be reduced to a simple scientific verdict, the eminently nasal Richard Dawkins could shut his repetitive trap. As it is he has to keep talking.

Atheism has become a very little idea because it is now promoted by little people with a small focus.  These people tend to think that there are two kinds of questions: the questions we have already answered and the questions we will answer tomorrow.  When they were even smaller than they are now, their father asked them every six weeks, “Whadja get in math and science?” When they had children of their own, they asked them, “Whadja get in science and math?”  Which goes to show, people can change.

They eschew mystery, unless it’s connected to a telescopic lens or an electron microscope or a neutrinometer at the Large Hadron Collider at CERN. “Mystery” is not a state to be enjoyed or celebrated like a good wine or a raven-haired woman with haunting and troubled eyes: it is a temporary state of befuddlement, an unknown sum, an uncharted particle, a glimpse of a distant galaxy, the possibility that Mars supported microbial life.

I get excited by all of these things, incidentally.  They are the sorts of things that put the sapiens (twice) in the name of our species.  Our ability to figure things out is almost mysterious, but not at all miraculous. In fact, a crucial part of modern humanism is the celebration of our continued and accelerating ability to make sense of the universe and where we are in it.

Strictly speaking, we do not need to know as much as we already do in order to survive, and there is no guarantee that knowing more will ensure our survival.  So it’s wondrous indeed that we care enough to put knowledge at the top of the human agenda.  It was the same mysterious attitude that pricked us into turning the vast and starry skies into the creation of a divine being who loved us, cared for us, and saved us from oblivion.

We have gradually concluded that this is probably not true: there is no such being–yet the vast and starry skies remain.  But we have not yet learned to love the universe as much as we once loved God because, as Stephen Crane once said, we know the universe does not love us back.

We lived before there was science, and we may live at some distant point–come hell, high water, nuclear catastrophe, plague, and asteroids that don’t miss–after it.  I do not regard an unimaginable future as unlikely, because nothing is more unlikely than that we should understand the world as well as we do now.

Atheism has been of practically no use in formulating this world view.  It is certainly true that a majority of scientists are either unbelievers (of some sort) or unconventional believers. But being an atheist was never a prerequisite to good science.  Understanding the natural world makes good science, a world in which the mysterious exists but the miraculous does not.

Science reified (with its consort, Reason) has become the convenient alternative deity of small atheists. But this is a relatively recent phenomenon.  Most of the greatest advances in science were made by “believers”: without getting into the mud over Einstein (who, whether a believer or not, was not an atheist), think of Newton, Mendel, Galileo, Kelvin, Darwin, Faraday, Boyle, Planck, and on and on. But the score at the end of this risky game is not reckoned by stacking theists against atheists.  Most smart people, some of whom are scientists, are not religious in the way religious people want them to be religious or irreligious in the way atheists want them to be atheists.

Max Planck

When did atheism cease to be a big idea?  When atheists made God a little idea.  When its idea of god shriveled to become a postulate of a new intellectual Darwinism.  When they began to identify unbelief with being a woman, a gay, a lesbian, or some other victimized cadre.   When they decided that religion is best described as a malicious and retardant cultural force that connives to prevent us being the Alpha Race of super-intelligences and wholly equal beings that nature has in store for  us. When they elevated naturalism, already an outmoded view of the universe, to a cause, at the expense of authentic imagination.

Atheism has become a little idea because it is based on the hobgoblin theory of religion: its god is a green elf with a stick, not the master of the universe who controls it with his omniscient will. –Let alone a God so powerful that this will could evolve into Nature’s God–the god of Jefferson and Paine–and then into the laws of nature, as it did before the end of the eighteenth century in learned discussion and debate.

Atheism until fairly recently has been about a disappointing search for god that ends in failure, disillusionment, despair, and finally a new affirmation of human ingenuity that is entirely compatible with both science and art.

That’s the way Sartre thought of it. –A conclusion forced upon us by the dawning recognition that we are both the source of and the solution to our despair.  That is what Walter Lippmann thought in 1929, when he described the erosion of belief by the acids of modernity.  This atheism was respectful of the fact that God is a very big idea, a sublime idea, and that abandoning such an idea could not take place as a mere reckoning at one moment in time; it had to happen as a process that included hatred, alienation and what Whitehead saw as “reconciliation” with the idea of God.  That is what Leo Strauss meant in 1953 when he wrote in Natural Right and History that the classical virtues would save the modern world from the negative trinity of pragmatism, scientism and relativism, and what Irving Babbitt (Lippmann’s teacher at Harvard) meant in declaring war on modernity and science in favour of the “inner check” of classical humanism.

In 1914, on the eve of World War I, a very young Lippmann surveyed the situation in America: “The sanctity of property, the patriarchal family, hereditary caste, the dogma of sin, obedience to authority–the rock of ages, in brief, has been blasted for us.”  A disillusioned soldier on the Western Front, Wilfred Owen asked poetically, “Was it for this the clay grew tall?” Ortega y Gasset observed that the goals that furnished yesterday’s landscape with “so definite an architecture” have lost their hold. Those that are to replace them have not yet taken shape, and so the landscape “seems to break up, vacillate, and quake in all directions.” And Yeats, elaborating on the kind of apocalyptic imagery he used in “The Second Coming,” recalled: “Nature, steel-bound or stone-built in the nineteenth century, became a flux where man drowned or swam.”  We all know the verdict: “Things fall apart,” because the god at the centre could not hold.  The image was highly appropriate because it was atomic and prophetic.

My current Angst, to use that hackneyed word correctly, is that most contemporary humanists don’t know what classical humanism is, and most modern atheists won’t even have read the books mentioned in the last paragraph, and what’s more will not care.  Their atheism is an uneven mixture of basic physics, evolutionary biology, half-cooked theories from the greasy kitchen of cognitive science, assorted political opinions, and what they regard as common sense.  They fell into atheism; they did not come to it.

That’s the way  recent atheism has been, an old fiddle with one string and one tune to play: We are the world.  Get over God. If the almighty  being and his raggedy book are relevant at all, it’s simply as a record of all the stupid things human beings can think of: superstitious sorghum, toxic drivel that stopped being relevant in the century its superstitious, toxic tropes were composed.

Was it only ten years ago that relatively dumb people were saying “Duh” to obtuse comments that they were afraid equally dumb people might miss without the exclamation, usually prefaced with “I mean, like…”?  The fad was almost as annoying as the similarly valenced interjection “Hello?”, which had to be said with the speaker four inches from your face, head tilted. Modern culture, this is to say, has survived the tyranny of not-very-bright Bright-lovers, the opinionated, the anti-obtusity of the obtuse.  That’s what the atheist militia, the campaigners, the billboard mongers are: people who just say “Duh” when they are asked about the existence of God.

“In all philosophic theory,” said Whitehead, with Russell the author of Principia Mathematica and thus no slouch when it came to close reasoning and logic, “there is an ultimate which is capable of characterization only through its accidental embodiments, and apart from these accidents is devoid of actuality. In the philosophy of organism this ultimate is termed ‘creativity’; and God is its primordial, non-temporal accident.”  Hello?

Atheist Meeting?

As I completed this blog, a friend forwarded to me an appreciation of a recent meeting of a group called Skepticon, a confederation of compatible atheist groups.

The piece reminds me of nothing so much as the scene in Roald Dahl’s The Witches where the hags come together, disguised under itchy wigs as ordinary housewives, to exchange “recipes.”

We are assured by one of the keynoters that skepticism is “a humanism,” whatever that is supposed to mean; PZ Myers and Greta Christina justified their rancid approaches to belief by saying that religion “hurts human beings” (well, that’s something to suppose, which is better than nothing to suppose); and a writer named James Croft praised the meeting’s “profoundly humanist…no cop-out approach,” while David Silverman, the head of American Atheists, warned that calling yourself a humanist is, in fact, a cop-out.

I mention Skepticon because to my mind the meeting is further evidence of the crisis that besets atheism.  It cannot quite embrace humanism at the margins, the solution to which, for certain ecumenical atheists, is to fiddle with the definition of humanism by rolling out the dough ever thinner. It cannot represent skepticism in a methodological way, because science and philosophy and even theology have been there and done that.  It cannot lay claim to helping people in a direct and positive (as opposed to a merely rhetorical) way, because it isn’t, after all, a social welfare movement.

It wants, like Pirandello’s lost characters, a cause, an author, something that defines it and sets it apart–science, reason, empathy, concern for human health–but it ends up sounding like a nightmare version of a Miss America contestant prompted to give her world-peace response.

What atheism and humanism have needed for a long time, and once came close to having, was a think tank to deal with the theoretical issues of these different movements. It may say worlds about the nature of atheism that this project, under the name of secular humanism, failed. Think, O ye of little faith and proud of it, how many temples of learning religion has built.  No, don’t: you’ll get it wrong.

But for a think tank, you need thinkers. What the atheists are left with is a stage and a microphone.

Letting Go of Jesus

Let’s pretend the year is 1748 and you have just come away from reading a new treatise by David Hume called “Of Miracles.”  Let’s assume you are a believing Christian who reads the Bible daily, as your grandmother taught you.  You normally listened to her because in her day most people still could not read, and if families owned a book at all it was likely to be the Bible.

Part of the reason you believe in God, and all of the reason you believe that Jesus was his son, is tied to the supernatural authority of scripture.  You have been taught that it is inspired—perhaps the very word of God, free from error and contradiction—passed down in purity and integrity from generation to generation, and a  reliable witness to the origins of the world, humankind and other biological species.

You know many verses by heart:  Honor your father and your mother.  Blessed are the poor.  Spare the rod, spoil the child.  The love of money is the root of all evil.  –Lots of stuff about disobedient children and the value of being poor, confirmed in your own experience: there are many more poor than rich people, and children often don’t listen to their parents.

Based on the bits you have read and heard preached about, you think the Bible is a wise and useful book.  If you are a member of an emerging middle or merchant class—whether you live in Boston or London or Edinburgh—you haven’t read enough history to wonder if the historical facts of the Bible are true, and archeology and evolutionary biology haven’t  arisen to prove them false.

The story of creation, mysterious as it may seem, is a pretty good story: It will do.  As to the deeper truths of the faith, if you are Catholic, your church assures you that the trinity is a mystery, so you don’t need to bother with looking for the word in the Bible, where it doesn’t occur.  If you are a churchgoing enthusiast who can’t wait for Sunday mornings to wear your new frock or your new vest, it doesn’t bother you that there’s no reference to a nine o’clock sermon in the New Testament.  If you are a Baptist and you like singing and praying loud, your church discipline and tradition tells you to ignore that part where Jesus told his followers to pray in silence and not like the Pharisees who parade their piety and pile phrase upon phrase.  After all, the parson has said, we don’t see many pharisees on the streets of Bristol or Newport.

But what really convinces you that what you do as a Christian of any denomination is the right thing to do is what theologians in the eighteenth century, the great period after the Newtonian revolution of the seventeenth, called “Christian Evidences.”

The phrase was introduced to make the supernatural elements of the Bible (and for Christians, the New Testament in particular) more up to date, more in keeping with the spirit of the Enlightenment.

Reasonable men and women who thought the medieval approach to religion was fiddle-faddle—something only the Catholics still believed, especially the Irish and Spanish—had begun to equate reason with the progress of Protestant Christianity. Newton had given this position a leg up when he suggested that his entire project in physics was to prove that the laws of nature were entirely conformable to belief in a clockwork God, the “divine mechanic.”

Taking their cue, or miscue, from Newton’s belief in an all-powerful being who both established the laws of nature and, as “Nature’s God,” could violate them at will, it seemed as though miracles had been given a new lease of life.  No one much bothered to read the damning indictment by the Dutch philosopher Baruch Spinoza, ten years Newton’s senior, who had argued that belief in a God whose perfection was based on the laws of nature could not be proved by exceptions to his own rules.  —You can play basketball on a tennis court, but it doesn’t explain the rules of tennis very well.

Anyway, you’re comfortable with Newton, and the idea of Christian “evidences,” and all those lovely stories about impudent wives being turned into pillars of salt, the ark holding Noah and his family teetering atop Mount Ararat (wherever it was), those vile Egyptians being swept up in the waters of the Red Sea, and the miraculous acts of kindness and healing, and bread and fishes, recounted in the New Testament.

As a Christian, you have seen all these tales as a kind of prelude to the really big story, the one about a Jewish peasant (except you don’t really think of him that way)  getting himself crucified for no reason at all, and surprising everyone by rising from the dead.   True, your medieval Catholic ancestors with their short, brutish and plague-besotted lives needed the assurance of a literal heaven more than you do in the eighteenth century.  But in general, you like the idea of resurrection, or at least of eternal life, and you agree with Luther—

“The sacred Book foretold it all: How death by death should come to fall.”

In other words, you believe in the Bible because it’s one of the only books you have ever read—and you have almost certainly not read even it cover to cover.  And in a vague, unquestioning, socially proper kind of way, you believe the book carries (to use the language of Dr Tillotson, whom Hume cites) the attestation of divine authorship, and in the circularity that defines this discussion prior to Hume, “divine attestation” is based on the miracles.

Divinity schools in England and America, which ridiculed such popish superstitions as the real presence and even such heretofore protected doctrines as the Trinity (Harvard would finally fall to the Unitarians in the 1820s, while the British universities came through unscathed thanks to laws against nonconformists), required students for the ministry to take a course called Christian Evidences.

The fortress of belief in an age of explanation became, ironically, the unexplained and the unusual.

By 1885, Amherst, Smith, Williams, Bryn Mawr, Rutgers, Dartmouth and Princeton colleges mandated the study of the evidences for Christian belief, on the assumption that the study of the Bible was an important ingredient of a well-rounded moral education.

Sophia Smith, the foundress of Smith College, stated in the third article of her will that “[because] all education should be for the glory of God and the good of man, I direct that the Holy Scriptures be daily and systematically read and studied in said college, and that all the discipline shall be pervaded by the spirit of evangelical Christian religion.”

But all was not well, even in 1885.  Hume’s “Of Miracles” was being read, and was seeping into the consciousness not only of philosophers and theologians but of parish ministers and young ministers in training and indolent intellectuals in the Back Bay and Bloomsbury.  Things were about to change.

Within his treatise, Hume, like a good Scotsman, appealed to common sense:  You have never seen a brick suspended in the air.   Wood will burn and fire will be extinguished by water.  Food does not multiply by itself with a snap of my fingers.  Water does not turn into wine. And in a deceptive opening sentence, he says, “…and what is more probable than that all men shall die.”

In fact, “nothing I call a miracle has ever happened in the ordinary course of events.”   It’s not a miracle if a man who seems to be in good health drops dead.  It is a miracle if a dead man comes back to life—because this has never been witnessed by any of us.  We only have reports, and even these can be challenged by the ordinary laws of evidence:  How old are these reports?  What is the reliability of the reporter?  Under what circumstances were they written?  Within what social, cultural and intellectual conditions did these reports originate?

Hume’s conclusion is so simple and so elegant that I sometimes wish it, and not the Ten Commandments, were what Americans in Pascagoula, Mississippi, were asking to be posted on classroom walls:

No testimony is sufficient to establish a miracle, unless the testimony be of such a kind, that its falsehood would be more miraculous, than the fact, which it endeavours to establish…

—So what is more likely, that a report about a brick being suspended in air is true, or that a report about a brick being suspended in air is based on a misapprehension?  That a report about a man rising from the dead is true, or that a report about a man rising from the dead is more easily explained as a case of mistaken identity or fantasy—or outright fiction.

The so-called “natural supernaturalism” of the Unitarians and eventually other Protestant groups took its gradual toll in the colleges I have mentioned.  At Smith College, beginning in the 1920s, Harry Elmer Barnes taught his students:

We must construct the framework of religion on a tenable superstructure. To do so is to surrender these essential characteristics of the older religion: (1) the reality and deity of the biblical God; (2) the uniqueness and divinity of Jesus and His special relevance for contemporary religion;  and (3) the belief in immortality.

Sophia Smith’s college had taken a new turn.  At Williams, James Bissett Pratt began his course in philosophy by telling his students, “Gentlemen, learn to get by without the Bible.”

At Yale, the Dwight Professor of Theology in 1933 repudiated all the miracles of the Bible and announced to his students that

“The Jesus Christ of the Christian tradition must die, so that he can live.”

I need to remind the casual reader: I am speaking of nineteenth- and early twentieth-century America, not Tübingen and certainly not Oxford.  The American theological establishment had been so radicalized by the transcendental revolution after Emerson’s 1838 Divinity School Address that miracles had been pronounced, in most of New England, and using Emerson’s own word, “a monster.”

Emerson

This little reflection on Hume and how his commentary on miracles changed forever the way people looked at the Gospels is really designed to indicate that in educated twentieth century America, between roughly 1905 and 1933, the battle for the miraculous, Christian evidences, and the supernatural was all but lost—or rather, it had been won by enlightened, commonsensical teachers in our best universities and colleges.

Of course it was not won in the churches and backwoods meeting houses of what we sometimes call the American heartland, let alone in preacher-colleges of (what would become) the Bible belt or the faux-gothic seminaries of the Catholic Church.  If anyone wants to know how superstition survived in this inauspicious climate, the answer would have to be sought in relative population statistics in the Back Bay and Arkansas.

Hume’s logic and the theological consequences of his logic barely penetrated the evangelical mindset.  And if I were to comment, I would say that we are now involved in wars throughout the world because some people, in America, the Middle East and elsewhere, still believe they will rise from the dead and go on to lead a life in paradise, qualitatively better than the life they have led in this world.  In other words, the failure to stop believing in miracles has had consequences that are not merely theological or philosophical but political.  America, the country where miracles were first to fall, is at war with its theological others over whose afterlife is true.

When the tide rolled out on miracles, what was left standing on the shore was the Jesus of what became, in the early twentieth century, the “social gospel.”

He wasn’t new—actually, he had a long pedigree going back to Kant and Schleiermacher in philosophy and theology.  He’d been worked through by poets like Coleridge and Matthew Arnold, who detested dogma and theological nitpicking and praised the “sweet reasonableness” of Jesus’ character and ethical teaching—his words about loving, forgiving, caring for the poor, and desiring a new social covenant based on concern for the “least among us.”

There is no doubt in the world that these words sparked the imaginations of a thousand social prophets, reformers, and even revolutionaries.

In Germany and America, and belatedly in England, something called the “higher criticism” was catching on.

Its basic premise was that the tradition about Jesus was formed slowly and in particular social conditions not equivalent to those in Victorian England or Bismarck’s Germany.

Questions had to be asked about why a certain tradition about Jesus arose, what need it might have fulfilled within a community of followers, and how it might have undergone change as those needs changed.

For example, the belief that he was the Jewish messiah, after an unexpected crucifixion, might have led to the belief that he was the son of God who had prophesied his own untimely death.  The fact that the community was impoverished, illiterate, and a persecuted religious minority might have led the community to invent sayings like “blessed are the poor,” “blessed are you who are persecuted,” and “blessed are those who hunger and thirst for justice.”

But if this is so, then the Gospels really weren’t the biography of Jesus at all.  They were the biography of what the community believed about him based on their own cramped perspective and needs in a very small corner of the world at a particular time in history. How could this story have universal importance or timeless significance?

The Victorian church was as immune to the German school of thought as Bishop Wilberforce was to Darwin’s theories—in some ways even more so.  Even knowledgeable followers of the German school of higher criticism tried to find ways around its conclusions.

Matthew Arnold, for example, thought the Gospels were based on the misunderstanding of Jesus by his own followers, which led them to misrepresent him. But then Arnold went on to say that this misunderstanding led the Gospel writers to preserve Jesus’ teaching more or less accurately, though in a distorted and conflated form.  They added their words and ideas to his, but in their ignorance was honesty.

Arnold’s influence was minimal. The miraculous deeds were gone; now people were fighting over the words.

John Dominic Crossan

When the twentieth century hit, few people in the mainline Protestant churches and almost no one in the Catholic Church of 1905 were prepared for the publication of Albert Schweitzer’s Quest of the Historical Jesus—a long, not altogether engaging survey of the eighteenth and nineteenth century attempts to piece together a coherent picture of the hero of the Gospels.

Schweitzer pronounced the quests a failure, because none of them dealt with the data within the appropriate historical framework.  No final conclusions were possible. We can know, because of what we know about ancient literature and ancient Roman Palestine, what Jesus might have been like—we can know the contours of an existence.  But not enough for a New York Times obituary.

Beyond tracing this line in shifting sands, we get lost in contradiction.   If Jesus taught anything, he must have taught something that people of his own time could have understood.  But that means that what he had to say will be irrelevant or perhaps incomprehensible to people in different social situations. His teaching, if we were to hear it, Schweitzer said, would sound mad to us.  He might have preached the end of the world.  If he did, he would not have spent his time developing a social agenda or an ethics textbook for his soon-to-be-raptured followers.  (Paul, whoever he was and whatever he was trying to do, certainly knows nothing about ethics—just some interim rules to be followed before the second coming of a divine man named Jesus).

Schweitzer flirts most with the possibility that Jesus was an eschatological prophet in an era of political and social gloom for the Jews.  But Schweitzer’s shocking verdict is that the Jesus of the church and the Jesus of popular piety—equally—never existed.  Whatever sketch you come up with will be a sketch based on the image you have already formed.

He was not alone. The Catholic priest Alfred Loisy (d. 1940), before his excommunication in 1908, wrote a book called The Gospel and the Church (1903), in which he lampooned the writings of the reigning German theologian Adolf von Harnack (d. 1930). Harnack had argued that the Gospel had permanent ethical value given to it by someone who possessed (what he called) God-consciousness: Jesus was the ethical teacher par excellence.

Loisy responded, drawing on his Gallic charms: “Professor Harnack has looked deep into the well for the face of the historical Jesus, but what he has seen is his own liberal Protestant face.”

In America, Jesus was undergoing a similar transformation.  In New York in 1917, a young graduate of the Colgate Divinity School named Walter Rauschenbusch was looking at the same miserable social conditions that were being described by everyone from Jane Addams to Theodore Dreiser in literature.  Rauschenbusch thought that the churches had aligned themselves with robber barons, supported unfair labor practices, winked at income disparity and ignored the poor.  So, for Rauschenbusch, the Gospel was all about a first century revolutionary movement opposed to privilege and injustice.  In his most famous book, A Theology for the Social Gospel, he writes, “Jesus did not in any real sense bear the sin of some ancient Briton who beat up his wife in B. C. 56, or of some mountaineer who got drunk in A. D. 1917. But he did in a very real sense bear the weight of the public sins of organized society, and they in turn are causally connected with all private sins.”

For Rauschenbusch, as for Harnack before him and dozens of social gospel writers after, the facts hardly mattered.  Whether Jesus actually said the things he is supposed to have said, or they were said for him, hardly mattered.  Whether he was understood or misrepresented hardly mattered.

Liberal religion had made Jesus a cipher for whatever social agenda it wanted to pursue, just as in the slavery debates of the nineteenth century, biblical authority was invoked to defend buying and selling human beings. Once the historical Jesus was abandoned, Jesus could be made to say whatever his managers wanted him to say.  Unfortunately, ignoring Schweitzer’s scholarly cautions, the progressives failed to demonstrate how the words of a first century Galilean prophet, apparently obsessed with the end of a corrupt social order, could be used to reform a morally bankrupt economic system.

For those of us who follow the Jesus quest wherever it goes, it’s impressive that the less we know about Jesus—the less we know for sure—the more books can be written.

In what must surely be the greatest historical irony of the late twentieth and early twenty-first century, for example, members of the “Jesus Seminar,” founded in 1985 to pare the sayings of Jesus down to “just the real ones,” came to the conclusion that 82 percent of the sayings of Jesus were (in various shades) inauthentic, that Jesus had never claimed the title “Messiah,” that he did not share a final meal with his disciples, and that he did not invent the Lord’s Prayer.

But they came to these conclusions in more than a hundred books of varying quality and interest, each of which promises to deliver the real Jesus.

The “real Jesus,” unsurprisingly, can be almost anything his inventors want him to be: prophet, wise man, magician, sage, bandit, revolutionary, gay, French, Southern Baptist or Cajun.  As I wrote in a contribution to George Wells’s 1996 The Jesus Legend, the competing theories about who Jesus really was, based on a shrinking body of reliable information, make the theory that he never existed a welcome relief.

In a Free Inquiry article from 1993, I offended the Seminar by saying that the Jesus of their labors was a “talking doll with a repertoire of 33 genuine sayings; pull his string and he blesses the poor.”

But all is not lost that seems lost. When we look at the history of this case, we can draw some conclusions.

We don’t know much about Jesus.  What we do know, however, and have known since the serious investigation of the biblical text based on sound critical principles became possible, is that there are things we can exclude.

Jesus was not Aristotle.  Despite what a former American president thought, he wasn’t a philosopher. He did not write a book on ethics.  If he lived, he would have belonged to a familiar class of wandering, puritanical doomsday preachers, who threatened the wrath of God on unfaithful Jews—especially the Jerusalem priesthood. I think that is likely.

We don’t know what he thought about the messiah or himself.  The Gospels are cagey on the subject and can yield almost any answer you want.  He was neither a social conservative nor a liberal democrat.   The change he (or his inventors) advocated was regressive rather than progressive.  But it’s also possible that we don’t even know enough to say that much.

He doesn’t seem to have had much of a work ethic; he tells his followers to beg from door to door, go barefoot (or not), and not worry about where their next meal is coming from. He might have been a magician; the law (Ex. xxii. 17 [A. V. 18]) which punishes sorcery with death speaks of the witch and not of the wizard, and exorcism was prevalent in the time of Jesus, as were magical amulets, tricks, healings, love potions and charms—like phylacteries. But we can’t be sure. If he was a magician, he was certainly not interested in ethics.

After a point, the plural Jesuses available to us in the Gospels become self-negating, and even the conclusion that the Gospels are biographies of communities becomes unhelpful: they are the biographies of different perspectives often arising within the same community.  Like the empty tomb story, the story of Jesus becomes the story of the man who wasn’t there.

What we need to be mindful of, however, is the danger of using greatly reduced, demythologized and under-impressive sources as though no matter what we do, or what we discover, the source—the Gospel—retains its authority.  It is obviously true that somehow the less certain we can be about whether x is true, the more possibilities there are for x.  But when I took algebra, we seldom defined certainty as the increase in a variable’s domain.   The dishonesty of much New Testament scholarship is the exploitation of the variable.

We need to be mindful that history is a corrective science:  when we know more than we did last week, we have to correct last week’s story.  The old story loses its authority. Biblical scholars and theologians often show the immaturity of their historical skills by playing with history.  They have shown, throughout the twentieth century, a remarkable immunity to the results of historical criticism, as though relieving Jesus of the necessity of being a man of his time and culture—however that might have been—entitles him to be someone who is free to live in our time, rule on our problems, and lend godly authority to our ethical dilemmas.

No other historical figure or legendary hero can be abused in quite the same way.  We leave Alexander the Great in the fourth century BCE, Cleopatra in Hellenistic Egypt, and Churchill buried at the family plot in Bladon near Oxford.  The quest for the historical Jesus is less a search for an historical artifact than a quest for ways to defend his continued relevance against the tides of irrelevance that erode the ancient image.

The use of Jesus as an ethical teacher has to go the way of his divinity and miracles, in the long run.  And when I say this, I’m not speaking as an atheist.  I am simply saying what I think is historically true, or true in terms of the way history deals with its own.

It is an act of courage, an act of moral bravery, to let go of God, and his only begotten Son, the second person of the blessed Trinity whose legend locates him in Nazareth during the Roman occupation.  It’s (at least) an act of intellectual honesty to say that what we would like to believe to be the case about him might not have been the case at all.  To recognize that Jesus—whoever he was–did not have answers for our time, could not have foreseen our problems, much less resolve them, frees us from the more painful obligation to view the Bible as a moral constitution.

The most powerful image in the New Testament, for me, is the one that most Christians today would probably be happy to see hidden away.

Its art-historical representations vary from merely pagan, to childish, to clearly outlandish.  I cannot think of one that does what I would like to see done with the event–the Ascension of Jesus into heaven.  It is simply too self-evidently mythological to appeal to liberal Christians, and not especially in favour among conservatives–though I have never understood why.

I see the Ascension as the ultimate symbol of the absence of God, the end of illusion.  The consciousness of the never-resurrected Christ, the ultimately mortal man, dawning on the crowd.  It is presented as glorification; but in reality it is perfectly human, perfectly natural: the way of all flesh: I am with you always, until the end of time.  It is the unknown author’s “Good night, sweet prince.”  It is the metaphorical confirmation of what Schweitzer taught us: “He comes to us as one unknown.”

On Not Quite Believing in God

A New Oxonian pebble from 2010

Baruch Spinoza

 

We seem to be witnessing the rapid development of atheist orthodoxy.

I say that as someone who has fallen prey to zingers used about the heretics in the fourth century Empire: According to my disgruntled atheist readers, I am confused, angry, unsettled, provocative, hurtful and creating division, which in Greek is what heresy means.

No one has come right out and said what this might imply: that the New Atheists, having written their four sacred books (a canon?), are not subject to correction. I haven’t been told that there is nothing further to study, or that the word of revelation came down in 2006 with the publication of The God Delusion. I have been told (several times) that I am mixing humanism and skepticism and doubt into the batch, when the batch, as in Moses’ day, just calls for batch. Or no batch. I have been reminded (and reminded) that atheism is nothing more than the simple profession of the belief that there is no God, or any gods. Credo non est deus: I believe there is no God.

When the first heretics were “proclaimed” (as opposed to pilloried by various disgruntled individual bishops) in 325–when the Council of Nicaea “defined” God as a trinity–a particular heretic named Arius was in the Church’s crosshairs. He believed that Jesus was the son of God, in an ordinary sense, if you can imagine it, and not eternal. The growing cadre of right-minded bishops, including his own bishop, Alexander, and Alexander’s deacon Athanasius, was committed to the popular intellectual view that everything God was, Jesus was, so Jesus had to be eternal too.

Was Jesus always a son, Arius asked. Yes, always, they replied. Was God always a father? Yes, always, they said: God does not change. Then what, asked Arius, is the meaning of terms like father and son? You are irredeemable and anathema to us, they replied. And they wrote their creed and gave the West a god who lasted, more or less, for 1500 years.

To this day, the only bit of the Nicene creed Christians won’t find in their prayer books is the last clause: “But those who say: ‘There was a time when he was not;’ and ‘He was not before he was made;’ and ‘He was made out of nothing,’ or ‘He is of another substance’ or ‘essence,’ or ‘The Son of God is created,’ or ‘changeable,’ or ‘alterable’—they are condemned to the fire by the holy catholic and apostolic Church.”  It would spoil the family atmosphere to end the prayer on a rancorous note.

I have always felt that the more you know about the history of ideas, the less likely you are to be a true believer. Studying science can have the same effect, but not directly (since science does not deal with religious questions directly) and usually (for obvious reasons) in relation to questions like cosmology rather than questions about historical evolution.

But that “challenge” kept me interested in history and to a lesser degree in philosophy, rather than causing me to throw my hands up and say “What’s the point?” I did not become an historian in order to vindicate any sort of belief, religious or political. But by becoming a historian I learned to recognize that all ideas, including God, have histories, and that the ideas of god in their historical context leave almost no room for philosophical discussions, however framed, about his existence. In fact, even having taught philosophy of religion routinely for two decades, I find the philosophical discussion almost as dull and flat as the scientistic hubris of the new atheists and their disciples.

When I took up a position as a professor of religious studies in Ann Arbor in the 1980s, students in the large-enrollment lectures immediately spotted me as a skeptic. When I touched on biblical subjects, bright-eyed students from western Michigan would often bring Bibles and try to trip me up on details. I would always say the same thing, after a few volleys: “We are not here to test your fidelity to the teaching of your church nor my fidelity to any greater cause. We’re here to study history. God can take it.” I wish I had a better message after twenty-eight years, but I don’t.

There are two chief problems with orthodoxy–any orthodoxy. Once it establishes itself, it kills its dissenters–if not physically, then by other means. It got Arius (though not before he’d done commendable damage); it got Hus; it got Galileo; and it might’ve gotten Descartes if he hadn’t been very clever in the Discourse on Method by creating a hypothetical pope-free universe.

Scientific orthodoxies had fared no better until the modern era, the advantage of modernity being that science learned the humility of error before it began to be right. It did not promote itself as timeless truth but as correctable knowledge. It would be remarkable if science, in its approach to religion, did not follow the same process, and I’m happy to say that in most cases it does.

For all the confusion about the new atheism attributed to me in the past few months, it seems to me that atheism is not science. It is an opinion (though I’d grant it higher status), grounded in history, to which some of the sciences, along with many other subjects, have something to contribute.

Almost everyone knows not only that the non-existence of God is not a “scientific outcome” but that it is not a philosophical outcome either. So, if it’s true that at its simplest, atheism is a position about God, and nothing else, then atheism will at least need to say why it is significant to hold such a position.

It can’t be significant just because atheists say so, so it must derive its significance from other ideas that attach to the belief in god, ideas that nonbelievers find objectionable and worth rejecting. The gods of Lucretius can’t be objectionable because, like John Wisdom’s god, they are not only invisible but indiscernible. Consequently, atheism cannot simply be about the nonexistence of God; it must be about the implications of that belief for believers.

Some of those beliefs matter more than others. For example, the belief that God created the world. In terms of the number of people who believe this and the vigor with which they are willing to defend that belief, this has to be the most important idea attached to belief in God.

Atheists who care to argue their case philosophically will maintain that evidence of an alternative physical mode of creation defeats demonstrations of the existence of God. In fact, however, the evidence is a disproof of explanations put forward in a creation myth, and that disproof comes from history long before it comes from philosophy and science. The evidence is nonetheless poignant. But it takes the question of God’s existence into fairly complex argumentation.

Atheists might also argue that belief in the goodness of God is contradicted by the existence of natural and moral evil (theodicy) or that belief in his benevolence and intelligence (design, teleology) is disproved by the fact that this is not the best of all possible universes. These quibbles are great fun in a classroom because they get people talking, thinking and arguing. But as you can see, we have already come a long way from the bare proposition that atheism is just about not believing in God–full stop, unless you have endowed that opinion with some authority outside the reasoning process you needed to get you there. That’s what fundamentalists do.

This recognition is unavoidable because you cannot disbelieve in something to which no attributes have been attached–unless, like St Anselm, you think that existence is a necessary predicate of divine (“necessary”) being. But that’s another story.

Frankly, some atheists are like instant oatmeal: quickly cooked and ready for consumption. No stove–no mental anguish, soul searching, philosophical dilemmas or affronts to ordinary morality–has cooked them. They are quick and, to belabor a term, EZ. When I use the term EZ atheists, I mean those atheists who short-cut propositions and adopt positions based on a less than careful examination of the positions they hold, or who hold them on authority rather than on strictly rational grounds: an atheist who holds a belief to be irrefragably true only because she or he has faith that it is true, or because a very important senior atheist–an atheist bishop, say–says so.

Most atheists, of course, do not establish their positions that way: William Hasker’s “The Case of the Intellectually Sophisticated Theist” (1986) and Michael Martin’s “Critique of Religious Experience” (1990), or the famous discussion between Basil Mitchell (a theist) and Antony Flew (an atheist) called “The Falsification Debate” (1955), provide important indicators of how the existence of God can be defeated propositionally. No atheist who now swims in shallow water should feel overwhelmed by reading these classic pieces. But something tells me most haven’t.

Recent articles by Jacques Berlinerblau and Michael Ruse have raised the broad concern that the effects of the “New Atheism” might actually be harmful. Why? Because it creates a class of followers who (like the early Christians) are less persuaded by argument than by the certainty of their position. It produces hundreds of disciples who see atheism as a self-authenticating philosophy, circumstantially supported by bits of science, and who, when challenged, resort to arguments against their critics rather than arguments in favour of their position. They point to the wonders of science, the horrors of the Bible, the political overreaching of religious activists. They also point to a mythical history of prejudice and persecution against atheists that, they may honestly believe, locates them in a civil rights struggle: to be an atheist is like being gay, black, a woman, an abused child.

Atheist Pride is just around the corner–no sorry: I’ve just seen the t-shirt.

A common criticism of the new atheists is that their journey to unbelief did not provide them with the tools necessary for such defense, or that they have found polemical tactics against their critics more effective than standard argumentation: thus, a critic is uninformed or a closet believer. Criticism becomes “rant,” diatribe, hot air; critics are “arrogant” and elitist, or prone to over-intellectualize positions that are really quite simple: Up or down on the God thing?

Points of contention become “confusion,” “divisive”; motives are reduced to spite and jealousy rather than an honest concern for fair discussion–epithets that were used freely against people like Arius and Hus, especially in religious disputes but rarely in modern philosophical discussion. The intensity with which the EZ atheist position is held might be seen as a mark of its fragility, comparable to strategies we see in Christian apologetics.

A year ago, my position on this issue was less resolute: I would have said then that new atheism is just a shortcut to conclusions that older atheists reached by a variety of means, from having been Jesuits to having been disappointed in their church, or education, to reading too much, or staying awake during my lectures. (Even I want some small credit for changing minds).

It is a fact that few people become atheists either in foxholes or philosophy class. But having seen the minor outcry against criticism of the New Atheist position by their adherents, I have come to the conclusion that Ruse and Berlinerblau are right: the new atheism is a danger to American intellectual life, to the serious study of important questions, and to the atheist tradition itself.

I have reasons for saying this. Mostly, they have nothing to do with the canonical status of a few books and speakers who draw, like Jesus, multitudes of hungry listeners. At this level, emotion comes into play, celebrity and authority come into play. Perhaps even faith comes into play. The bright scarlet A of proud atheism as a symbol of nonbelief and denial becomes an icon in its own right: The not-the-cross and not-the-crescent. And again, as we reach beyond not believing into symbolism and the authority of speakers who can deliver you from the dark superstitions of religion, without having to die on a cross, we have come a long way from simply not believing. That is what Professors Ruse and Berlinerblau have been saying.

But the real disaster of the new atheism is one I am experiencing as a college teacher. Almost three decades back I faced opposition from students who denied that history had anything to teach them about their strong emotional commitment to a belief system or faith. Today I am often confronted with students who feel just the same way–except they are atheists, or rather many of them have adopted the name and the logo.

I say “atheist” with the same flatness that I might say “evangelical,” but I know what it means pedagogically when I say it. It is a diagnosis not of some intellectual malfunction but a description of an attitude or perspective that might make historical learning more challenging than it needs to be. It means that the person has brought with her to the classroom a set of beliefs that need Socratic overhaul.

An atheism that has been inhaled at lectures given by significant thinkers is heady stuff. Its closest analogy is “getting saved,” and sometimes disciples of the New Atheists talk a language strangely like that of born agains. I hear the phrase “life changing experience” frequently from people who have been awakened at a Dawkins lecture, or even through watching videos on YouTube. It would be senseless to deny that the benefit is real. And it is futile to deny that leaving students in a state of incomplete transformation, without the resources to pursue unbelief–or its implications for a good and virtuous life beyond the purely selfish act of not believing–makes the task of education a bit harder for those of us left behind, in a non-apocalyptic sort of way.

I suspect this is pure fogeyism, but life-changing gurus have minimal responsibility after they have healed the blind.  –Jesus didn’t do post-surgical care.

I could cite dozens of examples of the challenges the new atheist position presents. Two from recent Facebook posts will do. In response to a Huffington Post blog by a certain Rabbi Adam Jacobs on March 24, one respondent wrote, “Thanks Rabbi. I think I will be good without god and eat a bacon cheeseburger and think of you cowering in fear of the cosmic sky fairy…” and another, “This crazy Rabbi is completely right. Atheism does imply a moral vacuum, whether we like it or not. But that doesn’t mean that we can just accept the manifestly false premises of religion just because it would create a cozy set of moral fictions for us, which is what the author seems to be saying.”

The cosmic sky fairy, a variation presumably on Bobby Henderson’s (pretty amusing) Flying Spaghetti Monster, doesn’t strike me as blasphemy. Almost nothing does. But it strikes me as trivial. A student who can dismiss a serious article about the relationship of science, morality and religion would be at a serious disadvantage if asked, let’s say, to read Aquinas in a first-year seminar. A worshiper of Richard Dawkins who can’t deal with Aquinas because he is “religious” is no better than an evangelical Christian who won’t read him because he was “Catholic.” That is where we are.

The second comment suggests that atheism is “de-moralizing,” in the sense that it eliminates one of the conventional grounds for thinking morality exists. The writer doesn’t find this troubling as an atheist, because he sees the post-Kantian discussion of morality as high-sounding but fruitless chatter: “There is no higher justification for any moral imperative beyond ‘because I think/feel it’s better.’” –I actually happen to agree with him. But I can’t begin a conversation at the conclusion. His honesty about the question is pinned to a view of atheism that, frankly, I cannot understand.

The essence of EZ atheism is this trivialization of questions that it regards as secondary to the entertainment value of being a non-believer, a status that some will defend simply through polemic or ridicule of anything “serious,” anything assumed to be “high culture” or too bookish.

I am not questioning the robustness of the movement, its popularity, or the sincerity of the followers. I am not trying to make new atheism rocket science or classical philology. I have never suggested it belongs to the academy and not to the village, because I know that nothing renders a worldview ineffective quite so thoroughly as keeping it locked in a university lecture hall.

The idea that there is no God, if it were left to me, would be discussed in public schools and from the pulpit. But it won’t be. For all the wrong reasons. When Harvard four years ago attempted to introduce a course in the critical study of religion into its core curriculum, its most distinguished professor of psychology, who happens also to be an atheist, lobbied (successfully) against it because it was to be taught as a “religion” course. Almost no one except a few humanists saw that atheism lost a great battle in that victory. And it lost it, I hate to say, because the professor responsible sensationalised the issue as “bringing the study of religion into the Yard” rather than keeping it safely sequestered in the Divinity School.

I want to suggest that the trivialization of culture (which includes religion and religious ideas), especially in America where trivial pursuits reign, is not especially helpful. And, as I have said pretty often, part of this trivialization is the use of slogans, billboards, out campaigns and fishing expeditions to put market share ahead of figuring things out.

Truth to tell, there is nothing to suggest that these campaigns have resulted in ratcheting up numbers, increasing public understanding of unbelief, or advancing a coherent political agenda. They have, however, potentially harmed atheism with tactics that simplify religious ideas to an alarming level (all the better to splay them) and by confirming in the minds of many “potential Brights” (Dennett) that their suspicions of atheism were well founded. Adherents of the New Atheists need to make a distinction between success as a corollary of profits to the authors and the benefit to the movement or, to be very old fashioned, to the ideals of an atheist worldview.

After a long time as a teacher, I am surprised to find myself writing about this. I have often found myself thinking, “If only half my students were atheists. Then we could get somewhere. We could say what we like, just the way we like it. We could follow the evidence where it takes us–no more sidestepping ‘awkward issues’ so as not to injure religious feelings.”

If only it were that easy: I may spend the remainder of my time in the academy imploring the sky fairy to smile on my efforts and deliver me from orthodoxy of all kinds.

Robert Ingersoll: God and Man in Peoria

“The public is a very strange animal, and although a good knowledge of human nature will generally lead a caterer of amusement to hit the people right, they are fickle and ofttimes perverse.” –P. T. Barnum


Robert Green Ingersoll was born in Dresden, New York, the son of a liberal Congregational (Presbyterian) father who had a knack for offending his god-fearing parishioners with his unparishionable views.

Ingersoll’s father, when his son was nine years old, had succeeded in calling himself to the attention of the presbytery and landing himself and his family in Ohio, then in Wisconsin, and then in Illinois, where he died with a cloudy charge of “unministerial conduct” hanging over his head. Such charges were not uncommon in the hypersensitive religious climate of the nineteenth century, and the polity of the Congregational protestant system encouraged them.

It’s hard to determine whether Ingersoll’s dismal view of Calvinist Christianity was spun off his empathy for his father’s treatment by the church, but the fact that the elder Ingersoll found himself in Dutch with the denomination so often may have had a predisposing influence.

“Bob,” as he grew older, seemed to channel his father’s view of hell (“all the meanness, all the revenge, all the selfishness, all the cruelty, all the hatred, all the infamy of which the heart of man is capable”) and the church (“[it] has always been willing to swap off treasures in heaven for cash down on earth”). His creed was floridly this-worldly: “Happiness is the only good; reason the only torch; justice the only worship, humanity the only religion, and love the only priest.” He was not an atheist, quite, and did not entirely hate religion–only the existing forms of it and its narrow-minded bosses, the clergy. The term “agnostic” was gaining currency even in nineteenth-century America, and he adopted it, with the following modification:

The Agnostic … occupies himself with this world, with things that can be ascertained and understood. He turns his attention to the sciences, to the solutions of questions that touch the well-being of man. He wishes to prevent and cure disease; to lengthen life; to provide homes and raiment and food for man; to supply the wants of the body. He also cultivates the arts. He believes in painting and sculpture, in music and the drama — the needs of the soul. The Agnostic believes in developing the brain, in cultivating the affections, the tastes, the conscience, the judgment, to the end that man may be happy in this world.   … The Agnostic does not simply say, “I do not know.” He goes another step and says with great emphasis that you do not know.

A man of his era, Ingersoll was also a man adrift in a country weirdly poised between superstition and progress, electric lights and religious gloom, where the native gods of puritan New England had migrated westward to combine with the strange gods of the prairie, leaving religious isobars that to this day have not been adequately interpreted.

Forced once to distinguish between the Catholic and the protestant faiths, Ingersoll, who was not prone to making unimportant distinctions, acknowledged, “the Pope is capable of intellectual advancement… the Pope is mortal, and the church cannot be afflicted with the same idiot forever. The Protestants have a book for their Pope. The book cannot advance. Year after year, and century after century, the book remains as ignorant as ever.”

It was an interesting statement coming from an agnostic, and oddly similar to an argument being made at precisely the same time, but for very different ends,  by his English contemporary, John Henry Newman, on the development of Christian doctrine.

The further details of his life show that he liked reading, though he scorned formal education (“Colleges are places where pebbles are polished and diamonds are dimmed”), an attitude that sat well with the commonsense public who often formed his audiences and has remained an ingredient in American anti-intellectualism to this day.

Ingersoll read law in the apprenticeship style typical of his day (in Illinois, where Lincoln, for whom he had unfettered admiration, also studied for the bar), fought in the Civil War, achieving the rank of regimental colonel, and, like many Republicans of his era, championed progressive political causes such as abolition and women’s suffrage.

He was admired for his language by Walt Whitman (“a fiery blast for new virtues, which are only old virtues done over for honest use again”) and for his “incipient poetry” by an overwrought Edgar Lee Masters (“He stripped off the armor of institutional friendships/To dedicate his soul/To the terrible deities of Truth and Beauty”).  Mark Twain, with whom he competed for crowds on the lecturing circuit, called him a “master of human speech.”  He died in 1899.

On the hustings, Ingersoll drew crowds at a-buck-a-pop county fairs and local theatrical events and normally packed the house with his scandalous aspersions toward the mother religion (then, protestantism) of the great Midwest, where his oratory had the biggest appeal. He was a draw comparable only to General Tom Thumb and “The Siamese Twins” Chang and Eng on the P. T. Barnum circuit. It was an age of flim flam and credulity–imposters and their exposers–hence a great era for both the snake-oil salesmen and the commonsense multitudes who, eyes opened by a sensible man, might run them out of town on a rail. All of this would become Zenith, Winnemac, by 1922, and be mawkishly sentimentalized by Meredith Willson in 1957.

But Ingersoll was as successful on Broadway and in Boston as he was on the circuit:  An 1892 appearance in New York not only packed the theatre, said a New York Times review of his “lecture” (on Voltaire), but required three hundred seats to be added to the stage!

Ingersoll added to the standard fare for these appearances a series of lectures on religion, which he had come to believe was the root of all evil in its Calvinist form and hocus pocus in its Catholic form.

With limited resources and limited access to book collections and libraries, he made do with the anti-Christian propaganda of his day, supplemented, mainly, by a few classic American texts (Tom Paine’s The Age of Reason being the cornerstone) and a little Voltaire–the two personalities being favorite topics for speechifying. Like many of the freethinkers of his day, Ingersoll had taught himself religion–which is both the source of his originality and the reason for his limitations as a thinker and a writer.


Before I go a crucial step farther, let me say I have always enjoyed reading Robert Ingersoll, mainly for the honesty he brought to America’s first “real century”–the period between the Revolution with its unassailably golden Enlightenment origins and the Civil War with its dark and brutal acknowledgement that the country was not, after all, a Jeffersonian democracy on the Greek model but a fractious compromise between inherently hateful factions. Ita sit semper.

Ingersoll understood that at the heart of the earliest stirrings of American disintegration was the unresolved question of religion, which the founders thought they had laid to rest, or at least contained, in the First Amendment to the Constitution.    When he is moved by the phantom of despair, as he was on the death of Lincoln in 1865, there is no finer craftsman on either side of the Atlantic, largely because he possessed what was then the famous American control of language–the spare style–that had been sacrificed in Victorian England for aureate mannerism.

  …People separated only by distance are much nearer together, than those divided by the walls of caste.  It is no advantage to live in a great city, where poverty degrades and failure brings despair. The fields are lovelier than paved streets, and the great forests than walls of brick. Oaks and elms are more poetic than steeples and chimneys.  In the country is the idea of home. There you see the rising and setting sun; you become acquainted with the stars and clouds. The constellations are your friends. You hear the rain on the roof and listen to the rhythmic sighing of the winds.  You are thrilled by the resurrection called Spring, touched and saddened by Autumn — the grace and poetry of death. Every field is a picture, a landscape; every landscape a poem; every flower a tender thought, and every forest a fairy-land. In the country you preserve your identity — your personality. There you are an aggregation of atoms, but in the city you are only an atom of an aggregation.   In the country you keep your cheek close to the breast of Nature. You are calmed and ennobled by the space, the amplitude and scope of earth and sky — by the constancy of the stars.  Lincoln never finished his education. To the night of his death he was a pupil, a learner, an inquirer, a seeker after knowledge. You have no idea how many men are spoiled by what is called education. For the most part, colleges are places where pebbles are polished and diamonds are dimmed. If Shakespeare had graduated at Oxford, he might have been a quibbling attorney, or a hypocritical parson.

Many of his quotes–the precursors of soundbites–are immortal: “An honest god is the noblest work of man”; and some of his intuitions about religions in general and the separation of church and state in particular are priceless.

An infinite God ought to be able to protect himself, without going in partnership with State Legislatures. Certainly he ought not so to act that laws become necessary to keep him from being laughed at. No one thinks of protecting Shakespeare from ridicule, by the threat of fine and imprisonment.

I  once presided over a same-sex marriage in Rochester, New York, using only Ingersoll’s words, which were beautiful and profound.  He would have made a great preacher, a monumental one, and, in most respects, was.  There is no new atheist who has his rhetorical power and probably, therefore, no challenger equal to him who will reach and persuade as many people.

Love is the only bow on Life’s dark cloud. It is the morning and the evening star. It shines upon the babe, and sheds its radiance on the quiet tomb. It is the mother of art, inspirer of poet, patriot and philosopher. It is the air and light of every heart — builder of every home, kindler of every fire on every hearth. It was the first to dream of immortality. It fills the world with melody — for music is the voice of love. Love is the magician, the enchanter, that changes worthless things to Joy, and makes royal kings and queens of common clay. It is the perfume of that wondrous flower, the heart, and without that sacred passion, that divine swoon, we are less than beasts; but with it, earth is heaven, and we are gods.

His mission, as he announced it in lectures resulting (in 1879) in a transcript called Some Mistakes of Moses, was to free the clergy, the schools and the politicians from a dishonesty that they are duty-bound to propagate:

Even the publicans and sinners believe reasonable things. To believe without evidence, or in spite of it, is accounted as righteousness to the sincere and humble Christian.    The ministers are in duty bound to denounce all intellectual pride, and show that we are never quite so dear to God as when we admit that we are poor, corrupt and idiotic worms; that we never should have been born; that we ought to be damned without the least delay; that we are so infamous that we like to enjoy ourselves; that we love our wives and children better than our God; that we are generous only because we are vile; that we are honest from the meanest motives.

And who would deny the prescience of these words:

It probably will not be long until the churches will divide as sharply upon political as upon theological questions; and when that day comes, if there are not liberals enough to hold the balance of power, this government will be destroyed.

###

And yet, Robert Green Ingersoll, like any pope not half so gifted with eloquence and rhetoric, was a mortal, and even free thought cannot be “afflicted with the same [man] forever.”

It is precisely the voltage of Ingersoll’s rhetorical gifts that makes him a poor prophet, someone whose clear and lucid contempt for religion, in its biblical form especially, is overpowered by a passionate disregard for the rapidly developing scholarship of his day. The combination of spite for the Calvinism and Methodism of the circuit and the conviction (held in common with self-made poets like Masters and Whitman) that education sullies creativity was a fatal flaw in Ingersoll’s ability to see to the core of America’s religiosity. If facts mattered, however, the men of his circle–Edison, Carnegie, Ford, even Alexander Graham Bell–were men of ingenuity rather than science. Only Bell had been near a university, and then only for a month.

As a self-professed “honest man,” Ingersoll could only parse the literature and customs of ancient peoples as contradictions, as “preserved abominations” that he assessed from his own vantage point in the slightly schizophrenic show-me and sideshow era. He wasn’t the first freethinker who held the biblical writers accountable to standards of performance and consistency totally alien to their time and culture, but he was the most passionate:

For many years I have regarded the Pentateuch simply as a record of a barbarous people, in which are found a great number of the ceremonies of savagery, many absurd and unjust laws, and thousands of ideas inconsistent with known and demonstrated facts.  To me it seemed almost a crime to teach that this record was written by inspired men; that slavery, polygamy, wars of conquest and extermination were right, and that there was a time when men could win the approbation of infinite Intelligence, Justice, and Mercy, by violating maidens and by butchering babes.

It’s been guessed that Ingersoll approached the Bible as a prosecutor: if so, he could not have had an easier fish to fry. The texts, because of their complex history, are full of contradictions, errors of fact and chronology, and instances of practices that later readers of the vernacular translations would be horrified to discover. Ingersoll was bolstered by an active tractarian movement in the atheist cause, especially a popular booklet called Self Contradictions of the Bible (1860) by William Henry Burr–a working-class pamphlet designed for use in actual debates with religious folk. The assumption of the debaters was that the Bible was a “cure” for itself: simply focus the attention of believers on the actual verses and they will retreat in terror from the implications of the doctrines they held to be true. As a prosecutor who stood, as he saw himself, “unwaveringly on the side of truth and justice,” Ingersoll felt honor-bound to show the guilt of the text and its promoters, “to point out the errors, contradictions and impossibilities contained in the Pentateuch.”

Unlike some of the deist critics of the century before, Paine especially, Ingersoll is unable to locate any redeeming qualities in the Bible: it is a consistent picture of human savagery. He does not bother to separate Jesus out from the pack of unworthies, though the greater part of his contempt is reserved for “Christianity” as an institutionalisation of superstition. He finds a stark contrast between the pagan myths, which he extols as beautiful and enriching fables that “reflect the face and form of Nature’s very self,” and the pure barbarism of the Bible, a history of violence, banality and human wickedness, fueled by power-grubbing priests, ineffectual prophets, and duped country bumpkins similar to those he encountered at Midwestern sideshows. For Ingersoll, Jesus may as well have been a travelling magician and the apostles his pitch-men.

When it came to scripture, Ingersoll was single-minded and usually wrong.  Here he was on the origin of the Bible:

A few wandering families — poor, wretched, without education, art or power, descendants of those who had been enslaved for four hundred years, ignorant as the inhabitants of Central Africa, had just escaped from the desert of Sinai…

At that time these wanderers had no commerce with other nations, they had no written language, they could neither read nor write. They had no means by which they could make this revelation known to other nations, and so it remained buried in the jargon of a few ignorant, impoverished and unknown tribes for more than two thousand years.

The men who did the selecting [of the Bible]  were ignorant and superstitious. They were firm believers in the miraculous. They thought that diseases had been cured by aprons and handkerchiefs of the apostles, by the bones of the dead. They believed in the fable of the Phoenix, and that the hyenas changed their sex every year.

The technical errors he propagates are the standard stuff he gleaned from the atheist tracts, an agglomeration of free-thought views that bolstered his literal reading of the text as being “true” or “false”; and if false, as he saw it, then a hoax comparable to those foisted on people by carnival owners. He takes his cue from his inability to see what was leading many scholars to conclude that the Bible was not the work of pastoralists–who could not have written it anyway–but of city boys, many centuries later. What scholarship was already using as significant clues to dating, Ingersoll treated as a pack of lies:

How, in the desert of Sinai, did the Jews obtain curtains of fine linen? How did these absconding slaves make cherubs of gold? Where did they get the skins of badgers, and how did they dye them red? How did they make wreathed chains and spoons, basins and tongs? Where did they get the blue cloth and their purple? Where did they get the sockets of brass? How did they coin the shekel of the sanctuary? How did they overlay boards with gold? Where did they get the numberless instruments and tools necessary to accomplish all these things? Where did they get the fine flour and the oil? Were all these found in the desert of Sinai? Is it a sin to ask these questions?

But even by the meager intellectual standards of nineteenth-century America, Ingersoll’s credulity towards the tracts is painfully obvious, an ignorance that extends not just to the biblical scholarship of his time, which was bursting with new discoveries and theories, but even to the “Harvard scholarship” that had emerged before the turn of the century through the energetic promotion of President Charles Eliot, who sent packs of timid young lecturers off to Germany starting in 1869 to soak in the New Criticism at Tuebingen and Heidelberg.

The fundamental error, which remains a fixture of free-thought and atheist belief well into the twentieth century, is that the Bible was produced by “savages”–wandering nomads and agriculturalists whose laws and ideas were vastly inferior to those of their cultural “opposites,” the Greeks. In European scholarship, the complex relationship between these two strands of thought was being charted by literary men like S. T. Coleridge and Matthew Arnold (who dubbed the two traditions, somewhat over-generously, sweetness and light), writer-translators like George Eliot (Mary Ann Evans), and biblical scholars like Benjamin Jowett, drawing on a robust boom in archaeology and text-studies in Germany.

George Eliot

Jerusalem was not an “agrarian society” in the first century CE or in the sixth century BCE; it was a thriving Hellenistic trading center at a crossroads with Persia, Babylon, Syria, and Rome. Its violent history and pattern of foreign exploitation made it both unruly and cosmopolitan, but fundamentally it was a city of merchants, scholars, priests and foreigners. It has a relatively uncontested history between the seventh century BCE and the period of the bar Kochba rebellion in the second century CE, the period during which most of the classic texts of the Hebrew Bible as well as the books of the New Testament came together. Nomads and agriculturalists don’t write books, compose poems like the psalms, or produce even worthless histories like the ones Ingersoll mocks in the Old Testament. While he accepted the emerging “modern” view that Moses was not the actual author of the books assigned to him (and generally bought wholesale the then radical view that the authorship of every biblical book is concocted), he found it convenient to use Moses as a literary conceit to drive home his point that the books were written by nomads on the run from a higher civilization, Moses being the biggest flim-flammer of them all:

For the purpose of controlling his followers (Moses) pretended that he was instructed and assisted by Jehovah, the god of these wanderers….

We know that Solomon did not write the Proverbs or the Song, that Isaiah was not the author of the book that bears his name, that no one knows the author of Job, Ecclesiastes, or Esther, or of any book in the Old Testament, with the exception of Ezra.

These notions, all of them available in the tracts, are actually quite important: not many laymen of his generation would have known them, or, if they had, would not have given them any credit. Ingersoll, however, did not see them as historically interesting–puzzles to be solved in pursuit of a complete picture of the biblical era. He saw them as part of a Great Deception, in the way a mind trained in the generation of carnival barkers and fakery would have seen them.

He would impart this way of doing history to a whole century of atheists and secularists after him: the Bible is the rude product of barbarian peoples, a deliberate work of deception formulated by priest-craft and supported by the superstition of the masses. It has propagated only misery and violence and discouraged education, ethics, and scientific progress. The only release from its clutches is to denounce it as the greatest hoax on earth, using the commonsense that no god gave us:

Let us admit what we know to be true; that Moses was mistaken about a thousand things; that the story of creation is not true; that the Garden of Eden is a myth; that the serpent and the tree of knowledge, and the fall of man are but fragments of old mythologies lost and dead; that woman was not made out of a rib; that serpents never had the power of speech; that the sons of God did not marry the daughters of men; that the story of the flood and ark is not exactly true; that the tower of Babel is a mistake; that the confusion of tongues is a childish thing; that the origin of the rainbow is a foolish fancy; that Methuselah did not live nine hundred and sixty-nine years; that Enoch did not leave this world, taking with him his flesh and bones; that the story of Sodom and Gomorrah is somewhat improbable; that burning brimstone never fell like rain; that Lot’s wife was not changed into chloride of sodium; that Jacob did not, in fact, put his hip out of joint wrestling with God; that the history of Tamar might just as well have been left out; that a belief in Pharaoh’s dreams is not essential to salvation; that it makes but little difference whether the rod of Aaron was changed to a serpent or not; that of all the wonders said to have been performed in Egypt, the greatest is, that anybody ever believed the absurd account; that God did not torment the innocent cattle on account of the sins of their owners; that he did not kill the first born of the poor maid behind the mill because of Pharaoh’s crimes; that flies and frogs were not ministers of God’s wrath; that lice and locusts were not the executors of his will; that seventy people did not, in two hundred and fifteen years, increase to three million; that three priests could not eat six hundred pigeons in a day; that gazing at a brass serpent could not extract poison from the blood; that God did not go in partnership with hornets; that he did not murder people simply because they asked for something to eat; that he did not declare the making of hair oil and ointment an offence to be punished with death; that he did not miraculously preserve cloth and leather; that he was not afraid of wild beasts; that he did not punish heresy with sword and fire; that he was not jealous, revengeful, and unjust; that he knew all about the sun, moon, and stars; that he did not threaten to kill people for eating the fat of an ox.

Ingersoll’s god is no god for his time. But his intolerance of myth and his energy for itemizing contradiction betray an even more alarming blandness and indifference to patterns of civilization, story-telling, government, learning, ideas of justice, and even ideas of progress. Commonsense, practical, “honest” men are often not history-of-ideas men, and perhaps that is why my enjoyment of Ingersoll does not translate into admiration. He is a second-rate mind in a century of towering intellectuals, and is at his worst when he implies that he is an agnostic messiah, as he does in the first chapter of Mistakes of Moses, addressed to the clergy.

Perhaps it is the fate of all autodidacts to know only about 75% of a picture, when the 25% that might have been taught by teachers could provide an understanding of the whole truth. That was Ingersoll’s fate–to be partial, and in being partial to be loudly unfair. He loves Shakespeare, but Shakespeare loved the Bible. He believes in truth and justice, and yet never imagines that ideas of both–even more malleable than those of the ancient Greek philosophers–are described in the prophets and proverbs. He is a great contradiction–someone who takes delight in the beauty of language but insists on a bloody literalism in the pages of the Bible, whose authors loved poetry, ideas, story and language as much as the Greeks. He detests supernaturalism among the Hebrews but does not seem to detect, or doesn’t care, that it suffuses Greek and Latin thought as well. Even the exhortations that form a part of his greatest speeches cannot be accepted as “true” or “false,” but only with the spirit of judgement and wisdom that the author of Ecclesiastes (perhaps his favourite book, if he had one) asks his reader to apply.

Ingersoll is never “wrong” at an emotional level;  his light in the darkness was the only light many people saw, even if they paid money to see it because it was the only show in town–legitimate verbal scandal in the calico and gingham emporiums of smalltown America.  It is hard to imagine anyone surviving an engagement like that in the current American political and religious climate without causing a riot, a thing that would sadden almost any of the great progressives of the nineteenth century, and Republicans at that.

Among later freethinkers and humanists in America–I have no doubt at all–Ingersoll has been read more often than Hume or Voltaire.  Partly this is a matter of style: even in transcribed form, Ingersoll is a good read.  He was the apostle who transmitted the plain but often plainly wrong message of the tracts to thousands of unbelievers using a showman’s method that would inspire later shapers of the secular movement.  Of his influence there can be no doubt.  But giving him the same stomping room he gave the great god of the savages, Jehovah, Robert Ingersoll was a man of his time.

__________________

There are two full-length studies of Ingersoll, both with extensive bibliographies: Clarence H. Cramer, Royal Bob: The Life of Robert Ingersoll (1952), is the best of the earlier studies, although not as good as Orvin Prentiss Larson, American Infidel: Robert G. Ingersoll (1962). A good account of the intellectual movement to which Ingersoll belonged is in Merle Curti, The Growth of American Thought (1943; 3d ed. 1964). Susan Jacoby’s Freethinkers (Metropolitan, 2004) is a valuable resource for the history of American secular thought and contains valuable information on Ingersoll and his time.  Many of the works of the Dresden collection are available online at http://www.positiveatheism.org/hist/ingermm1.htm, from which quotations used here are taken.


The Real Origins of Christianity

I’m moved to write this short piece by two disconnected and discordant events: one an advertisement, the other a death.

First the ad. I occasionally receive promotional stuff from the Center for Inquiry in Buffalo, New York. The Center was founded by Paul Kurtz, a long-term friend of mine and, until 2009, the chair of the CFI and its affiliate organizations. Recently I received news of a short course entitled “The Real Origins of Christianity” ($60, including t-shirt), to be taught by a librarian who blogs and self-publishes on New Testament studies, Richard Carrier, and an employee of CFI who specializes in American philosophy, John Shook. Apparently, while they were walking through the markets of East Jerusalem, someone sold them a magic key.

I worked for a short time within the Center and for a longer time alongside it as Chair of its Committee for the Scientific Examination of Religion (CSER).

While it lasted–for almost thirty years–CSER was a successful entrepot between professional, critical investigations into religion and biblical history and a general public that had understandably come to believe that religion was either what people did when the golf course was soggy (the benign version) or a variety of obnoxious television godhawkers in badly fitting suits, begging for money to pay their bills (the toxic version).

CSER sponsored a wide variety of conferences in its day, ranging from a groundbreaking one at the University of Michigan in 1984 (Jesus in History and Myth) which can fairly be said to have spurred a new generation of interest in the non-confessional study of the historical Jesus-question, to a 2004 conference at Cornell (Just War and Jihad), focusing on the sources of violence in Islam, Judaism and Christianity.  Its last academic conference was at the University of California at Davis in 2007, an investigation into the methods used by biblical and koranic scholars in analysing the origins of their sacred writings and traditions.

During its heyday, CSER attracted some very significant voices: Morton Smith of Columbia, the controversial “discoverer” of the Secret Gospel of Mark; Van Harvey of Stanford, America’s leading historian of religion; David Noel Freedman of UC San Diego, editor of the Anchor Bible; James Robinson of Claremont, the compiler of the first English translation of the Gnostic gospels. In addition to its stalwarts and recidivist contributors, it attracted a wide variety of younger scholars and international supporters as well, and was growing rapidly in outreach and prestige when CFI, without Kurtz at the helm, decided to suspend it as a cost-cutting measure.

At its last significant meeting in Davis, California, the aged sages and the young had multiple chances to interact and–as in a particularly lively and dramatic exchange between James Robinson and Arthur Droge of Toronto–to risk correction and possible embarrassment. Nothing is more energizing than watching lions defend their legacies while challengers try to gain ground. It is a spectacle that few of the “laity” ever get to witness: scholarship in action. Smart people correcting each other, egos exposed to the elements.

Morton Smith

And let me stop at that word.  Scholarship is fundamentally about correction, not the display of extreme or private theories in public.  Since the time of the ancient Greeks, it has been a “dialectic”– ideas getting tangled up with other ideas.  It is surgery, not sculpture, scalpels not chisels.  It requires knowing what to throw away and what to replace it with, and whether the new is any better than the old.  The word publication defines its purpose in its root: work designed for public scrutiny.

Without dialectic, which operates on the foundation of suspicion and skepticism, just as in the sciences, religious studies and biblical scholarship would still be a mere translation of texts assumed to be inviolable. Much of an older generation of biblical scholarship was just that: translation (often good translation), theological paraphrase, and noble efforts at establishing dates and points of origin based on (often spurious) reports and traditions.

Yet the danger of evading the dialectic is not just a “conservative” problem.  At another and equally dangerous extreme, private, non-dialectical theories might hold that eccentric views are inviolable because the opinions of experts only exist to be demolished–a kind of textual iconoclasm that thinks it can bypass “traditional” methods of investigation completely, even if it doesn’t fully comprehend them.  To trivialize this view ever so slightly, it is one often held by self-trained amateurs who think the greatest service they can perform for scholarship is to line all existing theories up against a wall and shoot them (and their perpetrators, if still alive) dead. Theoretically, this greatly accelerates the forward march of new opinion.

The New Trend

This is a fancy way of saying that the real origins of Christianity are not the subject for a monologue, certainly not one by amateur dialectic-avoiders. They are a subject for argument and interpretation, closer to a dog fight–of a genteel kind–than a dog show.

At the risk of offending amateurs and enthusiasts everywhere, biblical scholarship is not for amateurs and enthusiasts. It is arduous and often dull work. It means learning Greek and Coptic and Hebrew and Aramaic not just well but very well, and Latin just for fun. It means knowing how books were produced in the ancient world, what literary genres were available–what scribes ate for breakfast that might have spilled onto their paper, and what copyists (editors) were thinking that might have caused them to scratch something out.

If you were not taught this in graduate school, then you were not taught properly–or at all. It is not the Da Vinci Code. It is not normally fraught with exciting new discoveries or ingenious hoaxes, and when it is–as in the case of the Qumran (Dead Sea) scrolls or the Nag Hammadi (Gnostic) documents–the discovery soon turns to the drudgery of translation and piecing cultural puzzles together. In putting that puzzle together, it helps to have in your head an image of what the picture, in its totality, might plausibly look like. That is where the drudgery pays off.

It is primarily a story of watching extravagant claims about “startling new information” fade into the reality of prosaic results.  The (shameful) fifty years that elapsed between the discovery of the Dead Sea scrolls and their complete publication was a bitter period for scholars who were interested in “putting it out there”; the thirty years that intervened between the discovery and translation of a version of the Nag Hammadi material was a little better.  True, even reputable scholars have made preposterous claims before, during, and since the process of translation and editing.

But if your interest in New Testament studies (or any of the conspiracy or fabrication theories you now hold) is based on any of this work, my earnest advice to you is: don’t quit your day job. If you think that such work is best fueled by two-hour debates on the resurrection of Jesus with fundamentalist know-nothings rather than by subjecting your ideas to peer review and criticism, think again about pursuing it as a vocation. (Twenty-eight years after the publication of my “controversial” study of Marcion and the New Testament, I am still patiently defending my suppositions.)

P69: Marcion’s?

Having learned and taught the subject for more than thirty years, I can honestly say, I have no idea how Christianity began.  Having also read, however, most of the theories put forward by mythtics and Jesus-skeptics, I can also say, in a friendly kind of way: you’re not close to an answer.

Which brings me to the death.  I learned yesterday of the death of C. K. Barrett, who is described in his obituary, written as it happens by one of my PhD examiners, this way:

“Charles Kingsley Barrett, who has died aged 94, stood alongside CH Dodd as the greatest British New Testament scholar of the 20th century. Barrett regarded commentary on the texts as the primary task of the biblical scholar, and his meticulous commentaries have provided solid foundations for students and clergy for more than 50 years. He was a Methodist minister for nearly 70 years and, during his time as lecturer and professor of divinity at Durham University (1945-82), and in retirement there, he preached most Sundays in the city or a nearby village. His opposition to the scheme for Anglican-Methodist reunion in the 1960s brought him into contact with a wider public as a church leader, as well as a renowned teacher.”

Barrett was ancient, or considered so, even when I was a graduate student, and what Robert Morgan calls “meticulous” in his article many of us would have called unacceptably conservative. In his time he was considered anti-Semitic by some, and he stubbornly refused to revise commentaries that seemed to duplicate the worst instincts of German theologians and scholars.

Yet in other ways, he was fair-minded and most of us had devised ways to read around his incipient Calvinism for the jewels of insight that were embedded in books like The New Testament Background: Selected Documents (1956) and The Gospel of John and Judaism (published in German in 1970, and English in 1975).  As Morgan notes, “The learned and judicious historian of Christian origins did not in his writing and lecturing allow more than glimpses of the fire in his belly.”

A year after the appearance of Marcion, eager to have his opinion, I wrote to him; he responded with a single cordial sentence: “Very experimental, very tentative of course. We shall have to see.” Because his judgment mattered, it was a more important comment than “Good job” or “I might disagree with your premise if only I could find it.” I was especially hopeful for Barrett’s verdict because of a traditional opinion that Marcion–who, I had come to believe, is actually the author of the first written gospel–was anti-Jewish, a view I tried hard to disassemble.

Yet there it is: “Charles Kingsley Barrett, who has died aged 94, stood alongside CH Dodd as the greatest British New Testament scholar of the 20th century…” A man who did not know the Real Origins of Christianity though he knew practically everything about it.  And worse, a man with no listing in Wikipedia.

Killing “Humanism”: Definition by a Thousand Cuts

“It is a — most — provoking — thing,” Humpty said at last, “when a person doesn’t know a cravat from a belt!” “I know it’s very ignorant of me,” Alice said, in so humble a tone that Humpty Dumpty relented.

Preliminary: Of Words in General

Writing in defense of the language he loved and hated, H.L. Mencken wrote in The Smart Set for 1921 that “When two-thirds of the people who use a certain language decide to call it a freight train instead of a goods train, they are ‘right’; then the first is correct usage and the second a dialect.”

He was speaking of one of the minor irritants of usage that separate American and British speakers, “divided by a common language,” into those who love English because it isn’t French and those who value it primarily because it isn’t Spanish. –The perennial war between Britspeak and Amerispeak shows illiteracy on both sides of the water, however. About a third of our English words are French derivatives (about 28% Latin), and neither the linguistically recusant hillbillies in Tennessee nor the bankers in Fleet Street could give an Anglo-Saxon farmer of the year 1065 the time of day in a language he could understand. Lingua Anglorum non mortua est, but boy has it changed. Shift happens, linguistically speaking.

Modern linguistics was not really influenced by Mencken as such, but the “usage factor” has become a standard measure of what defines a language in practical terms. Living languages “obey” the habits of living speakers, described not prescribed.  If, to turn it around, two-thirds of people no longer know what a goods train is, then maybe it’s time to call it a freight train.

Besides, rightness and wrongness surely don’t hang on idiomatic differences–whether the British cringe when they hear an American saying “gotten” for “got” or “normalcy” for “normality” or “dude” for “bloke.” Whether I’m pissed, pissed off or told to piss off, I know it’s time to go home. (It’s the syme the ’ole world ovah. Now, Nigel: say fævah, favour, father). We have made so much of these issues for so long now (so bloody long now) that we all know what the other means, more or less. And the discussion–which might have been infinitely fascinating cocktail chitchat in the 1920s, when there really was a smart set, and when we began to encounter each other as hateful cousins in great numbers after a century and a half of virtual separation–is frankly a little boring.

I do agree with Tolkien, though:  American women all talk like they have a clothes peg (sorry, pin) over their nose.

Language is made up of words, and words are the primary agents of change. Once upon a time nice meant foolish; now it means nice. Egregious meant great, as in wonderful. Awful meant what we now mean by awesome, and a guy was always a bad guy—like Guy Fawkes—and now has become the most gender-neutral pronoun in the language, as in Really, you guys. A knave was just a boy (is it anything now?) and a silly girl wasn’t a giggly maid but a virgin. To interfere meant to have sex with; now it means interrupting someone in medias res, so to speak. Kill used to mean torture (as in “That joke just kills me”), but now is always used to mean to do away with someone, or do someone in. I’m hot can mean a couple of things. I’m cool, likewise. I’m gay probably means only one thing today—because when the winds of change have done their work, old meanings can be swept away entirely. And all that jazz.

Although not the only mechanism we possess to convey meaning, words are the most efficient because with them we can create nuance and abstraction, write poetry, form concepts, discuss the origins of the universe, fractals, Leibniz, and our neighbour’s mysterious parties. And while many meaning-changes or semantic shifts can be explained in terms of processes which diachronic (historical) linguistics can classify (narrowing, elevation, metaphor, antiphrasis, metonymy, etc.), other words with a specific history and semantic lode are less susceptible to shift—especially when they are concept-driven or definitional.

For example: to define capitalism as a process of collectivizing wealth and redistributing it to people on the basis of need would not get us very far in understanding the economic preferences of the Western democracies. To define prostitution as the practice of absolute chastity before and during marriage would be at least a little confusing. Classicism does not describe the political system of ancient Greece but its emphasis on balance, order and harmony in architecture and ideas, even though these ideals are (sort of) reflected in the political structures of Greek antiquity. Democratic, until fairly recently, did not describe an architectural style, though “Stalinist” can refer to an aesthetic or a system. True, some words (but not definitional ones), through a process called “auto-antonymy,” become their complementary opposite (That dress is so bad! = so good, or so hot), but native speakers will know how to flip the meaning from usage and context, with a minimum of intellectual exertion.

Of a Particular Word: Humanism

The unique message of humanism on the current world scene is its commitment to scientific naturalism. Most world views accepted today are spiritual, mystical, or theological in character. They have their origins in ancient pre-urban, nomadic, and agricultural societies of the past, not in the modern industrial or postindustrial global information culture that is emerging. Scientific naturalism enables human beings to construct a coherent world view disentangled from metaphysics or theology and based on the sciences. Humanist Manifesto 2000 (Paul Kurtz)

Over at New Oxonian, I have written several pieces about the difference between movement humanism–meaning organization-based humanism (secular, new, religious, ethical, neo-, and trans-)–and wholecloth humanism, on the analogy that these snippets have been selectively cut from the much broader historical phenomenon known simply as humanism.

The snipping is primarily a British and American pastime, just as the founding of organizations to promote a certain understanding of “humanism” (secular, political, and anti-theistic) has primarily unfurled as an Anglo-American project.  Any standard dictionary will attest to the success of this project: normally the first definition given is a movement-humanism definition, with the laudable exception of Webster’s.  –Leave it to the Americans to get something right in the long run, as Churchill once famously remarked.

My contention is that this snipping away has resulted in a technical reductio ad absurdum—a lessening and deadening of the whole concept originally conveyed by the term humanism.  Linguists (Ullmann [1962] being the most famous) have called the process semantic pejoration or weakening–much like defining democracy as “one man one vote” or puritans as early American fundamentalist Christians, 10% true but 90% misleading and thus 100% wrong. The tendency to turn the phenomenon called humanism into one of its multifarious effects or “tendencies” has not only turned humanism into a parody of itself, but the whole process has been done in such an artless way that the term has lost both integrity and valence: Humanism (recall Sartre’s famous quip about “existentialism”) now means so many different things that it has ceased to mean anything at all.

Hardly better is the Humpty-Dumpty insistence by some humanist organizations that humanism is non-theistic, secular, grounded in Enlightenment endorsements of “science” and “reason,” inherently and unarguably aligned with progressive politics and social movements, and committed to global ethics and values. In a circuitous way this insistence embraces the principles of documents like the Universal Declaration of Human Rights while remaining grotesquely ignorant of basic facts about its genesis–for example, that the famous Catholic-Thomistic philosopher Jacques Maritain was one of its principal authors.

This ignorance extends systematically to the role of religion in every progressive social and political movement since the time of the Revolution (including the Revolution): Abolition, women’s rights, civil rights, poverty alleviation, and economic and environmental activism. In fact, in its anti-theistic fervor, it is difficult to imagine a cause or movement so embarrassingly mistaken about the factors of cultural change as so-called “secular” humanism. It is equally difficult to locate a movement more craven in its lack of serious accomplishments in any of the areas it professes to care about: Baptists and Quakers did more for free speech, Unitarians more for secularism and education (think Harvard), and Catholics more for the poor and for building schools and hospitals. Add the Jews, the African American Church and a few other liberal denominations that the secular humanists never mention and you have roughly a capsule of America. Humanism was never irrelevant before “secular humanism” made it irrelevant by marginalizing it from the great social and political ideas of the time in favor of a crabbed and jaundiced view of religion in general. In a word, “secular humanism” has been disastrous on almost every front, but primarily in robbing humanism of its pedigree as a light in the darkness.

Just off the boat


***

Basically, the history of humanism is a story of cultic emanations from original purposes. The term itself had some currency in the Renaissance, and perhaps its finest early articulation is Pico della Mirandola’s Oration on the Dignity of Man. Anyone who has read the turgid twentieth-century humanist “manifestos” and has not read Pico should be deeply ashamed. But, simply, the denotative meaning of the term came to be “learning.”

Critics and reformers like Erasmus, Luther, Zwingli, Calvin and even an intellectually pretentious monarch like the second Henry Tudor could lay claim to the name as easily as Galileo, Boccaccio, Machiavelli, Michael Servetus, Marsilio Ficino, Petrarch, Montaigne, a significant number of popes (Pius II, Sixtus IV and Leo X) and members of the Roman curia. This humanism was decidedly not secular, not atheistic, and not very democratic or progressive. Its models were largely situated in antiquity and the new “science” of philology. Its first great victory was Lorenzo Valla’s discovery that the “Donation of Constantine,” thought to confer unlimited powers of government on the bishop of Rome, was a medieval forgery.

But it was rationalistic. The Cambridge History of Philosophy peals: “Here, one felt no weight of the supernatural pressing on the human mind, demanding homage and allegiance. Humanity—with all its distinct capabilities, talents, worries, problems, possibilities—was the center of interest. It has been said that medieval thinkers philosophized on their knees, but, bolstered by the new studies, they dared to stand up and to rise to full stature.” I am far from agreeing with the characterisation, but it isn’t all wrong.

It is difficult in our century, on the other side of this linguistic narrowing, to imagine the time when humanism was confident enough to incorporate the religious imagination in its understanding of man, nature and society rather than seeking to exclude it from the picture as a long history of unreason and error to which the human animal was prone prior to the Enlightenment, that magical period when (it’s unhistorically alleged) the human race woke from its long superstitious religious slumber.

The Enlightenment Myth: Humanism and Humeanism

Of course we never really woke up because we were never sound asleep. The idea that we were comes from the epoch-making historiographers of the 18th century (like the flatulent but always amusing Gibbon) who saw the Renaissance as too garish, the Middle Ages as too dark, and everything prior to that, except a few classical philosophers, as too superstitious.

This jaded Humean perspective (no one has ever been more wrong, more convincing, or more influential about the origins of monotheism) is what, in dilute form, defined the idea of progress and the various scientific materialisms that climaxed in Darwin and his explanatory template—one that fed social theory, psychology and the sciences for the next century and a half, in various ways, and one that substituted know-how for belief-in.

A 2009 article in the Economist suggests:

There has long been a tension between seeking perfection in life or in the afterlife. Optimists in the Enlightenment and the 19th century came to believe that the mass of humanity could one day lead happy and worthy lives here on Earth. Like Madach’s Adam, they were bursting with ideas for how the world might become a better place…. Some thought God would bring about the New Jerusalem, others looked to history or evolution. Some thought people would improve if left to themselves, others thought they should be forced to be free; some believed in the nation, others in the end of nations; some wanted a perfect language, others universal education; some put their hope in science, others in commerce; some had faith in wise legislation, others in anarchy. Intellectual life was teeming with grand ideas. For most people, the question was not whether progress would happen, but how…. The idea of progress forms the backdrop to a society. In the extreme, without the possibility of progress of any sort, your gain is someone else’s loss. If human behaviour is unreformable, social policy can only ever be about trying to cage the ape within. Society must in principle be able to move towards its ideals, such as equality and freedom, or they are no more than cant and self-delusion. So it matters if people lose their faith in progress. And it is worth thinking about how to restore it.[i]

Humanism, however, was not inherently “progressive.” The scientistic form of the idea of progress inherited from the 18th century was thoroughly uncritical–and still, as scientific naturalism, largely is.

While the reasons for its distrust of “progress” are complex, they extend to the Church’s claim of “continuing revelation” and doctrinal development as part of an organic evolution in Christianity. The history of their era had created in the early humanists a deep distrust of so-called development: “progress” and evolutionary processes were things to be examined, inquired into, deconstructed, not respected. In fact, humanism’s earliest achievements were conservative, or at least restorative, and focused on ideas, forms, texts and institutions (especially the Church) that had aged badly and were considered, in various degrees, corrupt.

Only through a generous application of the term generous has humanism been understood as a partisan movement for championing whatever sacred cows happen to be grazing in the trendy pastures of interest groups. As a “spirit” (long before the term Zeitgeist came to inhabit the intellectual world after Hegel), humanism was a touchstone that could invalidate as easily as it could inspire progressive ideologies—part of the reason for both the late-Marxist and early Heideggerian discomfort with the word and attempts to reform it. In colloquial terms, humanism was nobody’s baby.

In fact, the essential impulses of humanism were somewhat puritanical, in the original sense of purifying–which is why, in its methods, the humanist approach suited the reformers who saw religion as an inheritance of aggregated errors in text and teaching. They did not form a unified front, however: humanism was a modality, not a party or a cause. That distinction went to the terms “Protestant” and “Catholic.” Its insistence on criticism and the authority of the human intellect was not abstract but concrete: neither Calvin nor Montaigne in their different spheres believed in the unaided, untaught or unformed “reason” of the common man–and except for a few romantics like Rousseau (who never met an English Baptist or a North Carolina Methodist), no one in the Enlightenment did either.

Ironically, this skepticism about the “availability” of reason fit perfectly with the Church’s traditional teaching about the fall of man being essentially proved by his mulish stupidity.  It was one of the reasons the Church’s relatively well-educated hierarchy insisted on the authority of a magisterium, a teaching authority over the common man. And who, who has witnessed the American electoral process at work, would say that the Church was wrong?

A Very Little History

Much of “secular” humanism’s complaint about ninnyhammer fundamentalists is simply a remnant of the belief that not everyone enjoys the same capacity to reason—an idea that extends from Aristotle to Bertrand Russell and Avital Ronell. Humanism has always depended on an élite because since the beginning it has placed human intelligence at the center of its vision of the world. It is at its worst in the definition wars: a “democratic” movement, an “ethical” worldview, a “progressive” life-stance, a global vision. Whatever contribution the humanist modality has made to these areas of life and interest, they do not add up to whole-cloth humanism. And the various “humanist manifestos” have been little short of thievery in eviscerating the term of its modal power and turning a spirit into slogans, banal aphorisms, and more recently billboards: You can be good without God. You don’t need God to be loved. No God, no problem. I suppose it would be irrelevant to the proponents of this insipidity that humanism could not have done what it managed to do if it had begun with these proposals.

In fact, the attempts of movement-humanists to flip the meaning of the term have been degenerative, conceptually sloppy, and subversive. Like many linguistic changes, however, the mutually contradictory attempts to redefine and reclaim the term have been based on a misunderstanding of what even a limited and informed reading of humanism would entail.

Although “humanism” is considered nascent in the classical period, the word was first used descriptively, in an antiquarian sense, when Georg Voigt employed it in 1856 to describe the classical learning of the Renaissance.

Much more significant was its use by the “father” of cultural historians, Jacob Burckhardt, as a moment (Augenblick) when…

“both sides of human consciousness – the side turned to the world and that turned inward – lay, as it were, beneath a common veil, dreaming or half awake. The veil was woven of faith, childlike prejudices, and illusion; seen through it, world and history appeared in strange hues; man recognized himself only as a member of a race, a nation, a party, a corporation, a family, or in some other general category. It was in Italy that this veil first melted into thin air, and awakened an objective perception and treatment of the state and all things of this world in general; but by its side, and with full power, there also arose the subjective; man becomes a self-aware individual and recognises himself as such.”

That is, humanism was that moment when “humanity” became self-aware—which is why, incidentally, the story of Adam becomes vitally important to thinkers like Pico and Madach: the expulsion from Eden is seen not as a fall of man but as the rise of human responsibility. It paved the way for enormous changes in the university schools of the 17th and 18th centuries, leading finally to Bacon’s Novum Organum and the rejection of tradition (and traditional teachers like Aristotle) as the font of all wisdom.

Without these changes, the Enlightenment would have been a flicker in the dark. But the key point is that humanism did not as such align itself with causes. It remained, strictly speaking, the property of the philosophers, “literary” men and women (literae humaniores, “humane letters,” another name for classics), and, as the word became popular, men of science.

Humanism was not the sum of the socially progressive movements that learning made possible; it was the learning and impulse that created cultural balance and Platonic “justice” in a systematic and speciesized form. The early humanists would have declared that learning is the counterweight to all claims of authority and all forms of activism used in favor of (or against) such authority: the Catholic church of their day would have been interchangeable with the “progressive” social and economic regimes of the twentieth century.

As a word, humanism has been associated with everything from anarchy (Proudhon) to the cult of feeling (Renan) to opposition to organized religion (the young Marx and the left Hegelians). The sobriquet “secular humanism,” meaning a humanism stripped of religious affections and committed to the propagation of “democratic values” and ethical ideals, is perhaps the most crippled attempt to sell shreds as cloth or own the baby.

It is especially noxious, however, as a usurpation of the term by advocates who identify humanism closely with “secularism” and “non-theism.” Their narrow focus on the practice of science and the use of “reason”–whatever that term is thought to mean–has reached such hyperbolic and absurd levels that one could be forgiven for wondering why the term humanism (as opposed to atheosecularism, for example) is used at all.

As an historical linguist, I think I know the answer: it’s the desire for prestige-value fueled by what’s known as morphological plasticity—as when Congressmen appeal to the eight lone friends still listening to them as “the American People.”

The effect of the subversion of the idea of humanism by the atheosecularists has been to create a three-headed dog, defined primarily by (a) an American context, specifically identified with native religious yahooism; (b) the endorsement of the “universal” relevance of certain slogans associated with American political culture—especially “democracy,” “free speech” and “secularism”; and (c) under the banner of “reason,” the imposition of a naturalistic or atheist framework and a superstitious (and unproblematised) view of the Enlightenment, Darwin and his successors, and scientific progress. If one wanted a linguist’s ideal example of periphrasis (“real lady” for prostitute), this would be it.

From this beast, the most avid secular humanists profess to have derived an ethics–applicable, naturally, to the whole world (not in His hands) but unsurprisingly a little sketchy in particulars.

Secular humanist ethics obliges the human-valuer only to believe in the three principles above and act accordingly. Erasmus, on this account, is so much Dutch cheese.

Linguistically, I am not an “originalist.” But I do believe that words, like movements, can be subverted–not only by scoundrels in search of respectability but even by well-intentioned users.

The Duke and the Dauphin: British and French royalty on the Mississippi

There is no apostolic succession of meaning to the word humanism. Yet there is a recent history of abuse and misappropriation, of a concept subverted by contempt for, or ignorance of, its historical meaning. Humanism is not a freight train by any other name. It cannot mean what Humpty wants it to mean and nothing else (“With a name like yours” [he said to Alice] “you might be any shape, almost.”) Humanism is not an Alice. It is more like an egg.

Some words–noble words–should enjoy a peaceful life. They should be left alone to mean what they mean rather than what word-starved men and women want them to mean in the service of private causes. The use of the term humanism by secular humanists is its use by scoundrels in search of a non-emotive word for unbelief. But humanism has never been about unbelief, let alone the sort of unbelief that contemporary secular humanism espouses. It has always been about belief in a human spirit that rises above even discredited ideas of God and government.

Blessed are the legend-makers with their rhyme
of things not found within recorded time.
It is not they that have forgot the Night,
or bid us flee to organized delight,
in lotus-isles of economic bliss
forswearing souls to gain a Circe-kiss
(and counterfeit at that, machine-produced,
bogus seduction of the twice-seduced) – JRR Tolkien


__________________________________

[i] The best modern parable of progress was, aptly, ahead of its time. In 1861 Imre Madach published “The Tragedy of Man”, a “Paradise Lost” for the industrial age. The verse drama, still a cornerstone of Hungarian literature, describes how Adam is cast out of the Garden with Eve, renounces God and determines to recreate Eden through his own efforts. “My God is me,” he boasts, “whatever I regain is mine by right. This is the source of all my strength and pride.”

Adam gets the chance to see how much of Eden he will “regain”. He starts in Ancient Egypt and travels in time through 11 tableaux, ending in the icebound twilight of humanity. It is a cautionary tale. Adam glories in the Egyptian pyramids, but he discovers that they are built on the misery of slaves. So he rejects slavery and instead advances to Greek democracy. But when the Athenians condemn a hero, much as they condemned Socrates, Adam forsakes democracy and moves on to harmless, worldly pleasure. Sated and miserable in hedonistic Rome, he looks to the chivalry of the knight crusader. Yet each new reforming principle crumbles before him. Adam replaces 17th-century Prague’s courtly hypocrisy with the rights of man. When equality curdles into Terror under Robespierre, he embraces individual liberty—which is in turn corrupted on the money-grabbing streets of Georgian London. In the future a scientific Utopia has Michelangelo making chair-legs and Plato herding cows, because art and philosophy have no utility. At the end of time, having encountered the savage man who has no guiding principle except violence, Adam is downcast—and understandably so. Suicidal, he pleads with Lucifer: “Let me see no more of my harsh fate: this useless struggle.”