Play Mythty For Me? Dr Carrier Carries On

It’s considered bad form for an author or editor to reply to critical reviews of their own work. But as Richard Carrier’s recent attempt to trash Sources of the Jesus Tradition looks more like a fit of distemper than a serious attempt to assess a collection of essays (and hardly represents “my own work”), I think his “review” is fair game.

Sources, like a lot of anthologies, is an untidy book.  That’s not my phrase; it’s one I learned from looking at reviews of two significant twentieth-century collections–a famous one by Hans Bartsch called Kerygma and Myth, consisting of scattered and not very well focused responses to Rudolf Bultmann’s classic essay The New Testament and Mythology, and a real second-rater, undeservedly famous, called The Myth of God Incarnate.  In fact, it was the editor of the latter, John Hick, who called it untidy.

I’d probably describe each anthology as one or two good essays surrounded by clutter and private opinion.  Most scholars of any experience know that “collections” and anthologies have a very low batting average in terms of popular success and none at all in terms of financial success.  The corollary is that no editor ever became famous on the basis of editing other people’s work, nor, probably, a total reprobate either.  Richard Carrier wants it to be otherwise.

Anthologies are untidy because unless the contributors agree on every point or disagree on a defined set of them, the essays tend to wander over the predilections of the essayists. Meeting and conference papers are especially notorious in this respect, editor-driven themed collections much less so.

Sources emanated from a couple of conferences associated with an initiative called the Jesus Project, about which I’ve written far too much. Carrier was invited to become a part of this initiative a few years ago, after its “founding” at UC Davis in 2007 and just prior to its suspension in 2009 by the host organization, the Center for Inquiry, for which he now works, apparently as an advocate.*

Carrier was originally enthusiastic about the aims, even about my leadership.    He now says that on the basis of post-publication (!) conversations he had with me, “Hoffmann was a complete dick to me, and wouldn’t own up even to the mistakes I had actual proof he had made. Rumor has it he’s like this. But this was my first experience of it. His behavior toward me leaves me with no further sympathy for him, so here it goes….” What “goes” is a cyclone of aspersion that even in the sections where his sentences parse looks like the verbiage of an under-trained enthusiast.  (As an aside, New Testament scholarship is getting a lot of amateurs lately, most of them under-trained).

I am still not sure what “mistakes” he’s referring to other than his own, which were as substantial as his contribution was irrelevant, a long discursus on Bayes’ theorem that never once budges above pedantic lecturing to engage the literary material – the New Testament – to which its application is supposed to be relevant. A cautious, or less sympathetic, editor would have cut it eo ipso as being totally to the left of the topic, though Carrier shows a fleeting acquaintance with some of the methods (and limits) of conventional New Testament criticism.  It does not rise to the level of convincing expertise.

The other essays Carrier finds worthwhile, indeed redemptive, are the contributions of Frank Zindler, head of the American Atheist Press, and Ron Lindsay, head of the organization that employs him (and one suspects, the organization at whose bidding he’s doing this hatchet job). Needless to say, he feels his own essay belongs to this lot.

What these three contributions have in common is that their authors share the conviction that Jesus did not exist.  That’s a fair conclusion, as I have said on several occasions, and one of the areas the Jesus Project was meant to address.  Of the three, Zindler comes closest, tonally, to “fitting” in with the essays Carrier would like to rip away, especially my own. Lindsay on the other hand writes a fairly anachronistic piece using the formulations of modern American jurisprudence as a basis for deciding questions of “evidence” in the gospels.  But while naive, it at least (to quote the author) discharges its duty to the subject matter, unlike Carrier’s piece, where the subject matter never comes into view.  To be generous, it may be largely the writer’s own sense of the deficiency of his performance that leads him to accuse me of sloppy editing.  There is a lot an editor can do to ensure that an article or chapter is an accurate representation of what its author intended it to say.  There is virtually nothing an editor can do to make an article rewrite itself once it’s been written.

Thomas Bayes. Maybe.

Carrier also claims that my own public presentation at the conference does not correspond to what I have included in the book.  As a matter of fact, “On Not Finding the Historical Jesus”  and “The Canonical Historical Jesus” represent the entirety of the handwritten scripts of my presentations at the Amherst conference, edited for publication but not at all substantially different from what was said in 2008.

Whether the essays, meager and merely suggestive as they are, have any merit beyond what Carrier assigns to them, I cannot say.  I can say the “naivety” he curiously assigns to me concerning the origins of the sayings of Jesus, the identity of Paul and (especially) the status of Ephesians reveals a woeful ignorance of my own scholarship in this area, especially in terms of the history of the canon.  Beyond this, what he says is pure tantrum and loaded with the language of a man who strives to be outrageous and appears to be perennially upset.

Do we agree about anything?  Yes, the chapters by Luedemann and Meggitt are very good.  So, however, are the chapters by Trobisch and MacDonald and Chilton. As for Arthur Droge, whose comments at the meeting were also very good, Arthur was not able to get them to me in publishable form before deadline, though a version of his remarks appeared in the journal Caesar, cut by CFI at the same time that the Jesus Project was defunded. As for James Tabor and others, their lectures were not available because they formed part of work already committed to publishers.  They were gracious enough to share their ideas with the group–as were many others at UC Davis in 2007. I do not think this is unusual, but I recognize that as a full-time self-promoter Carrier does not travel an orthodox conference circuit where this protocol would be familiar to him.  He writes primarily for his fans, atheists pre-committed to his view of a mythical Jesus who then pretend to be passionate about evidence and method. Obviously people like me deserve the ire of people like that.

Yet even by my low standards, a 50% rate of good and excellent essays is a “win,” especially since the “losers” that get the axe as “fails” are mostly my own (3 of the 15).

Dr. Carrier has spent an extraordinary amount of time and energy trying to separate me out from the group in order to perform a kind of literary assassination, but in a way so crude and bilious that the whole interminable exercise sounds like a whine.

But to recap: The book remains untidy, like a lot of anthologies that begin as conferences and papers.  I wish it could have been tidier. I am guessing, however, that the sore thumb sticking out of the collection in such a way that its author must now wonder what he was doing is an essay entitled “Bayes Theorem for Beginners.”  I certainly wonder what it’s doing there.

__________________________________

*As of April 2011, much of the work of the Jesus Project is subsumed in a new group completely independent of CFI and its agenda. Information concerning The Jesus Prospect is available from its managing director,  S.L. Fisher:  stephanielouisefisher@btinternet.com

Philosophy’s Search for the Immutable

During the lazy month of June New.Ox. offers a random but largely relevant selection of things I like to read and think about.  rjh

John Dewey

[Previously] we noted incidentally the distinction made in the classic tradition between knowledge and belief, or, as Locke put it, between knowledge and judgment. According to this distinction the certain and knowledge are co-extensive. Disputes exist, but they are whether sensation or reason affords the basis of certainty; or whether existence or essence is its object. In contrast with this identification, the very word “belief” is eloquent on the topic of certainty. We believe in the absence of knowledge or complete assurance. Hence the quest for certainty has always been an effort to transcend belief. Now since, as we have already noted, all matters of practical action involve an element of uncertainty, we can ascend from belief to knowledge only by isolating the latter from practical doing and making.

In this chapter we are especially concerned with the effect of the ideal of certainty as something superior to belief upon the conception of the nature and function of philosophy. Greek thinkers saw clearly–and logically–that experience cannot furnish us, as respects cognition of existence, with anything more than contingent probability. Experience cannot deliver to us necessary truths, truths completely demonstrated by reason. Its conclusions are particular, not universal. Not being “exact” they come short of “science.” Thus there arose the distinction between rational truths or, in modern terminology, truths relating to the relation of ideas, and “truths” about matters of existence, empirically ascertained. Thus not merely the arts of practice, industrial and social, were stamped matters of belief rather than of knowledge, but also all those sciences which are matters of inductive inference from observation.

One might indulge in the reflection that they are none the worse for all that, especially since the natural sciences have developed a technique for achieving a high degree of probability and for measuring, within assignable limits, the amount of probability which attaches in particular cases to conclusions. But historically the matter is not so simple as to permit of this retort. For empirical or observational sciences were placed in invidious contrast to rational sciences which dealt with eternal and universal objects and which therefore were possessed of necessary truth. Consequently all observational sciences as far as their material could not be subsumed under forms and principles supplied by rational science shared in the depreciatory view held about practical affairs. They are relatively low, secular and profane compared with the perfect realities of rational science.

And here is a justification for going back to something as remote in time as Greek philosophy. The whole classic tradition down to our day has continued to hold a slighting view of experience as such, and to hold up as the proper goal and ideal of true knowledge realities which even if they are located in empirical things cannot be known by experimental methods. The logical consequence for philosophy itself is evident. Upon the side of method, it has been compelled to claim for itself the possession of a method issuing from reason itself, and having the warrant of reason, independently of experience. As long as the view obtained that nature itself is truly known by the same rational method, the consequences–at least those which were evident–were not serious. There was no break between philosophy and genuine science–or what was conceived to be such. In fact, there was not even a distinction; there were simply various branches of philosophy, metaphysical, logical, natural, moral, etc., in a descending scale of demonstrative certainty. Since, according to the theory, the subject-matter of the lower sciences was inherently of a different character from that of true knowledge, there was no ground for rational dissatisfaction with the lower degree of knowledge called belief. Inferior knowledge or belief corresponded to the inferior state of subject-matter.

The scientific revolution of the seventeenth century effected a great modification. Science itself through the aid of mathematics carried the scheme of demonstrative knowledge over to natural objects. The “laws” of the natural world had that fixed character which in the older scheme had belonged only to rational and ideal forms. A mathematical science of nature couched in mechanistic terms claimed to be the only sound natural philosophy. Hence the older philosophies lost alliance with natural knowledge and the support that had been given to philosophy by them. Philosophy in maintaining its claim to be a superior form of knowledge was compelled to take an invidious and so to say malicious attitude toward the conclusions of natural science. The framework of the old tradition had in the meantime become embedded in Christian theology, and through religious teaching was made a part of the inherited culture of those innocent of any technical philosophy. Consequently, the rivalry between philosophy and the new science, with respect to the claim to know reality, was converted in effect into a rivalry between the spiritual values guaranteed by the older philosophic tradition and the conclusions of natural knowledge. The more science advanced the more it seemed to encroach upon the special province of the territory over which philosophy had claimed jurisdiction. Thus philosophy in its classic form became a species of apologetic justification for belief in an ultimate reality in which the values which should regulate life and control conduct are securely enstated.

There are undoubted disadvantages in the historic manner of approach to the problem which has been followed. It may readily be thought either that the Greek formulation which has been emphasised has no especial pertinency with respect to modern thought and especially to contemporary philosophy; or that no philosophical statement is of any great importance for the mass of non-philosophic persons. Those interested in philosophy may object that the criticisms passed are directed if not at a man of straw at least to positions that have long since lost their actuality. Those not friendly to any form of philosophy may inquire what import they have for any except professed philosophers.

The first type of objection will be dealt with somewhat in extenso in the succeeding chapter, in which I shall try to show how modern philosophies, in spite of their great diversity, have been concerned with problems of adjustment of the conclusions of modern science to the chief religious and moral tradition of the western world; together with the way in which these problems are connected with retention of the conception of the relation of knowledge to reality formulated in Greek thought. At the point in the discussion now reached, it suffices to point out that, in spite of great changes in detail, the notion of a separation between knowledge and action, theory and practice, has been perpetuated, and that the beliefs connected with action are taken to be uncertain and inferior in value compared with those inherently connected with objects of knowledge, so that the former are securely established only as they are derived from the latter. Not the specific content of Greek thought is pertinent to present problems, but its insistence that security is measured by certainty of knowledge, while the latter is measured by adhesion to fixed and immutable objects, which therefore are independent of what men do in practical activity.

The other objection is of a different sort. It comes from those who feel that not merely Greek philosophy but philosophy in any form is remote from all significant human concern. It is willing to admit or rather assert that it is presumptuous for philosophy to lay claim to knowledge of a higher order than that given by natural science, but it also holds that this is no great matter in any case except for professional philosophers.

There would be force in this latter objection were it not that those who make it hold for the most part the same philosophy of certainty and its proper object that is held by philosophers, save in an inchoate form. They are not interested in the notion that philosophic thought is a special means of attaining this object and the certainty it affords, but they are far from holding, either explicitly or implicitly, that the arts of intelligently directed action are the means by which security of values is to be attained. With respect to certain ends and goods they accept this idea. But in thinking of these ends and values as material, as related to health, wealth, control of conditions for the sake of an inferior order of consequences, they retain the same division between a higher reality and a lower that is formulated in classic philosophy. They may be innocent of the vocabulary that speaks of reason, necessary truth, the universal, things in themselves and appearances. But they incline to believe that there is some other road than that of action, directed by knowledge, to achieve ultimate security of higher ideals and purposes. They think of practical action as necessary for practical utilities, but they mark off practical utilities from spiritual and ideal values. Philosophy did not originate the underlying division. It only gave intellectual formulation and justification to ideas that were operative in men’s minds generally. And the elements of these ideas are as active in present culture as they ever were in the past. Indeed, through the diffusion of religious doctrines, the idea that ultimate values are a matter of special revelation and are to be embodied in life by special means radically different from the arts of action that deal with lower and lesser ends has been accentuated in the popular mind.

Here is the point which is of general human import instead of concern merely to professional philosophers. What about the security of values, of the things which are admirable, honourable, to be approved of and striven for? It is probably in consequence of the derogatory view held of practice that the question of the secure place of values in human experience is seldom raised in connection with the problem of the relation of knowledge and practice. But upon any view concerning the status of action, the scope of the latter cannot be restricted to self-seeking acts, nor to those of a prudential aspect, nor in general to things of expediency and what are often termed “utilitarian” affairs. The maintenance and diffusion of intellectual values, of moral excellencies, the aesthetically admirable, as well as the maintenance of order and decorum in human relations are dependent upon what men do.

Whether because of the emphasis of traditional religion upon salvation of the personal soul or for some other reason, there is a tendency to restrict the ultimate scope of morals to the reflex effect of conduct on one’s self. Even utilitarianism, with all its seeming independence of traditional theology and its emphasis upon the general good as the criterion for judging conduct, insisted in its hedonistic psychology upon private pleasure as the motive for action. The idea that the stable and expanding institution of all things that make life worth while throughout all human relationships is the real object of all intelligent conduct is depressed from view by the current conception of morals as a special kind of action chiefly concerned with either the virtues or the enjoyments of individuals in their personal capacities. In changed form, we still retain the notion of a division of activity into two kinds having very different worths. The result is the depreciated meaning that has come to be attached to the very meaning of the “practical” and the useful. Instead of being extended to cover all forms of action by means of which all the values of life are extended and rendered more secure, including the diffusion of the fine arts and the cultivation of taste, the processes of education and all activities which are concerned with rendering human relationships more significant and worthy, the meaning of “practical” is limited to matters of ease, comfort, riches, bodily security and police order, possibly health, etc., things which in their isolation from other goods can only lay claim to restricted and narrow value. In consequence, these subjects are handed over to technical sciences and arts; they are no concern of “higher” interests which feel that no matter what happens to inferior goods in the vicissitudes of natural existence, the highest values are immutable characters of the ultimately real.

Our depreciatory attitude toward “practice” would be modified if we habitually thought of it in its most liberal sense, and if we surrendered our customary dualism between two separate kinds of value, one intrinsically higher and one inherently lower. We should regard practice as the only means (other than accident) by which whatever is judged to be honourable, admirable, approvable can be kept in concrete experienceable existence. In this connection the entire import of “morals” would be transformed. How much of the tendency to ignore permanent objective consequences in differences made in natural and social relations; and how much of the emphasis upon personal and internal motives and dispositions irrespective of what they objectively produce and sustain, are products of the habitual depreciation of the worth of action in comparison with forms of mental processes, of thought and sentiment, which make no objective difference in things themselves?

It would be possible to argue (and, I think, with much justice) that failure to make action central in the search for such security as is humanly possible is a survival of the impotency of man in those stages of civilisation when he had few means of regulating and utilising the conditions upon which the occurrence of consequences depend. As long as man was unable by means of the arts of practice to direct the course of events, it was natural for him to seek an emotional substitute; in the absence of actual certainty in the midst of a precarious and hazardous world, men cultivated all sorts of things that would give them the feeling of certainty. And it is possible that, when not carried to an illusory point, the cultivation of the feeling gave man courage and confidence and enabled him to carry the burdens of life more successfully. But one could hardly seriously contend that this fact, if it be such, is one upon which to found a reasoned philosophy.

It is to the conception of philosophy that we come back. No mode of action can, as we have insisted, give anything approaching absolute certitude; it provides insurance but no assurance. Doing is always subject to peril, to the danger of frustration. When men began to reflect philosophically it seemed to them altogether too risky to leave the place of values at the mercy of acts the results of which are never sure. This precariousness might hold as far as empirical existence, existence in the sensible and phenomenal world, is concerned; but this very uncertainty seemed to render it the more needful that ideal goods should be shown to have, by means of knowledge of the most assured type, an indefeasible and inexpugnable position in the realm of the ultimately real. So at least we may imagine men to have reasoned. And to-day many persons find a peculiar consolation in the face of the unstable and dubious presence of values in actual experience by projecting a perfect form of good into a realm of essence, if not into a heaven beyond the earthly skies, wherein their authority, if not their existence, is wholly unshakeable.

Instead of asking how far this process is of that compensatory kind with which recent psychology has made us familiar, we are inquiring into the effect upon philosophy. It will not be denied, I suppose, that the chief aim of those philosophies which I have called classical, has been to show that the realities which are the objects of the highest and most necessary knowledge are also endowed with the values which correspond to our best aspirations, admirations and approvals. That, one may say, is the very heart of all traditional philosophic idealisms. There is a pathos, having its own nobility, in philosophies which think it their proper office to give an intellectual or cognitive certification to the ontological reality of the highest values. It is difficult for men to see desire and choice set earnestly upon the good and yet being frustrated, without their imagining a realm in which the good has come completely to its own, and is identified with a Reality in which resides all ultimate power. The failure and frustration of actual life is then attributed to the fact that this world is finite and phenomenal, sensible rather than real, or to the weakness of our finite apprehension, which cannot see that the discrepancy between existence and value is merely seeming, and that a fuller vision would behold partial evil an element in complete good. Thus the office of philosophy is to project by dialectic, resting supposedly upon self-evident premises, a realm in which the object of completest cognitive certitude is also one with the object of the heart’s best aspiration. The fusion of the good and the true with unity and plenitude of Being thus becomes the goal of classic philosophy.

The situation would strike us as a curious one were it not so familiar. Practical activity is dismissed to a world of low grade reality. Desire is found only where something is lacking and hence its existence is a sign of imperfection of Being. Hence one must go to passionless reason to find perfect reality and complete certitude. But nevertheless the chief philosophic interest is to prove that the essential properties of the reality that is the object of pure knowledge are precisely those characteristics which have meaning in connection with affection, desire and choice. After degrading practical affairs in order to exalt knowledge, the chief task of knowledge turns out to be to demonstrate the absolutely assured and permanent reality of the values with which practical activity is concerned! Can we fail to see the irony in a situation wherein desire and emotion are relegated to a position inferior in every way to that of knowledge, while at the same time the chief problem of that which is termed the highest and most perfect knowledge is taken to be the existence of evil–that is, of desires errant and frustrated?

The contradiction involved, however, is much more than a purely intellectual one–which if purely theoretical would be innocuously lacking in practical consequences. The thing which concerns all of us as human beings is precisely the greatest attainable security of values in concrete existence. The thought that the values which are unstable and wavering in the world in which we live are eternally secure in a higher realm (which reason demonstrates but which we cannot experience), that all the goods which are defeated here are triumphant there, may give consolation to the depressed. But it does not change the existential situation in the least. The separation that has been instituted between theory and practice, with its consequent substitution of cognitive quest for absolute assurance for practical endeavour to make the existence of good more secure in experience, has had the effect of distracting attention and diverting energy from a task whose performance would yield definite results.

The chief consideration in achieving concrete security of values lies in the perfecting of methods of action. Mere activity, blind striving, gets nothing forward. Regulation of conditions upon which results depend is possible only by doing, yet only by doing which has intelligent direction, which takes cognisance of conditions, observes relations of sequence, and which plans and executes in the light of this knowledge. The notion that thought, apart from action, can warrant complete certitude as to the status of supreme good, makes no contribution to the central problem of development of intelligent methods of regulation. It rather depresses and deadens effort in that direction. That is the chief indictment to be brought against the classic philosophic tradition. Its import raises the question of the relation which action sustains to knowledge in fact, and whether the quest for certainty by other means than those of intelligent action does not mark a baneful diversion of thought from its proper office. It raises the question whether mankind has not now achieved a sufficient degree of control of methods of knowing and of the arts of practical action so that a radical change in our conceptions of knowledge and practice is rendered both possible and necessary.

That knowing, as judged from the actual procedures of scientific inquiry, has completely abandoned in fact the traditional separation of knowing and doing, that the experimental procedure is one that installs doing as the heart of knowing, is a theme that will occupy our attention in later chapters. What would happen to philosophy if it whole-heartedly made a similar surrender? What would be its office if it ceased to deal with the problem of reality and knowledge at large? In effect, its function would be to facilitate the fruitful interaction of our cognitive beliefs, our beliefs resting upon the most dependable methods of inquiry, with our practical beliefs about the values, the ends and purposes, that should control human action in the things of large and liberal human import.

Such a view renounces the traditional notion that action is inherently inferior to knowledge and preference for the fixed over the changing; it involves the conviction that security attained by active control is to be more prized than certainty in theory. But it does not imply that action is higher and better than knowledge, and practice inherently superior to thought. Constant and effective interaction of knowledge and practice is something quite different from an exaltation of activity for its own sake. Action, when directed by knowledge, is method and means, not an end. The aim and end is the securer, freer and more widely shared embodiment of values in experience by means of that active control of objects which knowledge alone makes possible. [In reaction against the age-long depreciation of practice in behalf of contemplative knowledge, there is a temptation simply to turn things upside down. But the essence of pragmatic instrumentalism is to conceive of both knowledge and practice as means of making goods – excellencies of all kinds – secure in experienced existence.]

From this point of view, the problem of philosophy concerns the interaction of our judgments about ends to be sought with knowledge of the means for achieving them. Just as in science the question of the advance of knowledge is the question of what to do, what experiments to perform, what apparatus to invent and use, what calculations to engage in, what branches of mathematics to employ or to perfect, so the problem of practice is what do we need to know, how shall we obtain that knowledge and how shall we apply it?

It is an easy and altogether too common a habit to confuse a personal division of labor with an isolation of function and meaning. Human beings as individuals tend to devote themselves either to the practice of knowing or to the practice of a professional, business, social or aesthetic art. Each takes the other half of the circle for granted. Theorists and practitioners, however, often indulge in unseemly wrangles as to the importance of their respective tasks. Then the personal difference of callings is hypostatised and made into an intrinsic difference between knowledge and practice.

If one looks at the history of knowledge, it is plain that at the beginning men tried to know because they had to do so in order to live. In the absence of that organic guidance given by their structure to other animals, man had to find out what he was about, and he could find out only by studying the environment which constituted the means, obstacles and result of his behaviour. The desire for intellectual or cognitive understanding had no meaning except as a means of obtaining greater security as to the issues of action. Moreover, even when after the coming of leisure some men were enabled to adopt knowing as their special calling or profession, merely theoretical uncertainty continues to have no meaning.

This statement will arouse protest. But the reaction against the statement will turn out when examined to be due to the fact that it is so difficult to find a case of purely intellectual uncertainty, that is, one upon which nothing hangs. Perhaps as near to it as we can come is in the familiar story of the Oriental potentate who declined to attend a horse race on the ground that it was already well known to him that one horse could run faster than another. His uncertainty as to which of several horses could out-speed the others may be said to have been purely intellectual. But also in the story nothing depended from it; no curiosity was aroused; no effort was put forth to satisfy the uncertainty. In other words, he did not care; it made no difference. And it is a strict truism that no one would care about any exclusively theoretical uncertainty or certainty. For by definition in being exclusively theoretical it is one which makes no difference anywhere.

Revulsion against this proposition is a tribute to the fact that actually the intellectual and the practical are so closely bound together. Hence when we imagine we are thinking of an exclusively theoretical doubt, we smuggle in unconsciously some consequence which hangs upon it. We think of uncertainty arising in the course of an inquiry; in this case, uncertainty until it is resolved blocks the progress of the inquiry – a distinctly practical affair, since it involves conclusions and the means of producing them. If we had no desires and no purposes, then, as sheer truism, one state of things would be as good as any other. Those who have set such store by the demonstration that Absolute Being already contains in eternal safety within itself all values, have had as their interest the fact that while the demonstration would make no difference in the concrete existence of these values – unless perhaps to weaken effort to generate and sustain them – it would make a difference in their own personal attitudes – in a feeling of comfort or of release from responsibility, the consciousness of a “moral holiday” in which some philosophers have found the distinction between morals and religion.

Such considerations point to the conclusion that the ultimate ground of the quest for cognitive certainty is the need for security in the results of action. Men readily persuade themselves that they are devoted to intellectual certainty for its own sake. Actually they want it because of its bearing on safeguarding what they desire and esteem. The need for protection and prosperity in action created the need for warranting the validity of intellectual beliefs.

After a distinctively intellectual class had arisen, a class having leisure and in a large degree protected against the more serious perils which afflict the mass of humanity, its members proceeded to glorify their own office. Since no amount of pains and care in action can ensure complete certainty, certainty in knowledge was worshipped as a substitute. In minor matters, those that are relatively technical, professional, “utilitarian,” men continued to resort to improving their methods of operation in order to be surer of results. But in affairs of momentous value the requisite knowledge is hard to come by and the bettering of methods is a slow process to be realised only by the cooperative endeavour of many persons. The arts to be formed and developed are social arts; an individual by himself can do little to regulate the conditions which will render important values more secure, though with shrewdness and special knowledge he can do much to further his own peculiar aims – given a fair share of luck. So because of impatience and because, as Aristotle was given to pointing out, an individual is self-sufficient in that kind of thinking which involves no action, the ideal of a cognitive certainty and truth having no connection with practice, and prized because of its lack of connection, developed. The doctrine worked out practically so as to strengthen dependence upon authority and dogma in the things of highest value, while increase of specialised knowledge was relied upon in everyday, especially economic, affairs. Just as belief that a magical ceremony will regulate the growth of seeds to full harvest stifles the tendency to investigate natural causes and their workings, so acceptance of dogmatic rules as bases of conduct in education, morals and social matters, lessens the impetus to find out about the conditions which are involved in forming intelligent plans.

It is more or less of a commonplace to speak of the crisis which has been caused by the progress of the natural sciences in the last few centuries. The crisis is due, it is asserted, to the incompatibility between the conclusions of natural science about the world in which we live and the realm of higher values, of ideal and spiritual qualities, which get no support from natural science. The new science, it is said, has stripped the world of the qualities which made it beautiful and congenial to men; has deprived nature of all aspiration towards ends, all preference for accomplishing the good, and presented nature to us as a scene of indifferent physical particles acting according to mathematical and mechanical laws.

This effect of modern science has, it is notorious, set the main problems for modern philosophy. How is science to be accepted and yet the realm of values to be conserved? This question forms the philosophic version of the popular conflict of science and religion. Instead of being troubled about the inconsistency of astronomy with the older religious beliefs about heaven and the ascension of Christ, or the differences between the geological record and the account of creation in Genesis, philosophers have been troubled by the gap in kind which exists between the fundamental principles of the natural world and the reality of the values according to which mankind is to regulate its life.

Philosophers, therefore, set to work to mediate, to find some harmony behind the apparent discord. Everybody knows that the trend of modern philosophy has been to arrive at theories regarding the nature of the universe by means of theories regarding the nature of knowledge–a procedure which reverses the apparently more judicious method of the ancients in basing their conclusions about knowledge on the nature of the universe in which knowledge occurs. The “crisis” of which we have just been speaking accounts for the reversal.

Since science has made the trouble, the cure ought to be found in an examination of the nature of knowledge, of the conditions which make science possible. If the conditions of the possibility of knowledge can be shown to be of an ideal and rational character, then, so it has been thought, the loss of an idealistic cosmology in physics can be readily borne. The physical world can be surrendered to matter and mechanism, since we are assured that matter and mechanism have their foundation in immaterial mind. Such has been the characteristic course of modern spiritualistic philosophies since the time of Kant; indeed, since that of Descartes, who first felt the poignancy of the problem involved in reconciling the conclusions of science with traditional religious and moral beliefs.

It would presumably be taken as a sign of extreme naïveté, if not of callous insensitiveness, if one were to ask why all this ardour to reconcile the findings of natural science with the validity of values? Why should any increase of knowledge seem like a threat to what we prize, admire and approve? Why should we not proceed to employ our gains in science to improve our judgments about values, and to regulate our actions so as to make values more secure and more widely shared in existence?

I am willing to run the risk of the charge of naïveté for the sake of making manifest the difference upon which we have been dwelling. If men had associated their ideas about values with practical activity instead of with cognition of antecedent Being, they would not have been troubled by the findings of science. They would have welcomed the latter. For anything ascertained about the structure of actually existing conditions would be a definite aid in making judgments about things to be prized and striven for more adequate, and would instruct us as to the means to be employed in realising them. But according to the religious and philosophic tradition of Europe, the valid status of all the highest values, the good, true and beautiful, was bound up with their being properties of ultimate and supreme Being, namely, God. All went well as long as what passed for natural science gave no offence to this conception. Trouble began when science ceased to disclose in the objects of knowledge the possession of any such properties. Then some roundabout method had to be devised for substantiating them.

The point of the seemingly crass question which was asked is thus to elicit the radical difference made when the problem of values is seen to be connected with the problem of intelligent action. If the validity of beliefs and judgments about values is dependent upon the consequences of action undertaken in their behalf, if the assumed association of values with knowledge capable of being demonstrated apart from activity, is abandoned, then the problem of the intrinsic relation of science to value is wholly artificial. It is replaced by a group of practical problems: How shall we employ what we know to direct the formation of our beliefs about value and how shall we direct our practical behaviour so as to test these beliefs and make possible better ones? The question is seen to be just what it has always been empirically: What shall we do to make objects having value more secure in existence? And we approach the answer to the problem with all the advantages given us by increase of knowledge of the conditions and relations under which this doing must proceed.

But for over two thousand years the weight of the most influential and authoritatively orthodox tradition of thought has been thrown into the opposite scale. It has been devoted to the problem of a purely cognitive certification (perhaps by revelation, perhaps by intuition, perhaps by reason) of the antecedent immutable reality of truth, beauty and goodness. As against such a doctrine, the conclusions of natural science constitute the materials of a serious problem. The appeal has been made to the Court of Knowledge and the verdict has been adverse. There are two rival systems that must have their respective claims adjusted. The crisis in contemporary culture, the confusions and conflicts in it, arise from a division of authority. Scientific inquiry seems to tell one thing, and traditional beliefs about ends and ideals that have authority over conduct tell us something quite different. The problem of reconciliation arises and persists for one reason only. As long as the notions persist that knowledge is a disclosure of reality, of reality prior to and independent of knowing, and that knowing is independent of a purpose to control the quality of experienced objects, the failure of natural science to disclose significant values in its objects will come as a shock. Those seriously concerned with the validity and authority of value will have a problem on their hands. As long as the notion persists that values are authentic and valid only on condition that they are properties of Being independent of human action, as long as it is supposed that their right to regulate action is dependent upon their being independent of action, so long there will be needed schemes to prove that values are, in spite of the findings of science, genuine and known qualifications of reality in itself. For men will not easily surrender all regulative guidance in action. If they are forbidden to find standards in the course of experience they will seek them somewhere else, if not in revelation, then in the deliverance of a reason that is above experience.

This then is the fundamental issue for present philosophy. Is the doctrine justified that knowledge is valid in the degree in which it is a revelation of antecedent existences or Being? Is the doctrine justified that regulative ends and purposes have validity only when they can be shown to be properties belonging to things, whether as existences or as essences, apart from human action? It is proposed to make another start. Desires, affections, preferences, needs and interests at least exist in human experience; they are characteristics of it. Knowledge about nature also exists. What does this knowledge imply and entail with respect to the guidance of our emotional and volitional life? How shall the latter lay hold of what is known in order to make it of service?

These latter questions do not seem to many thinkers to have the dignity that is attached to the traditional problems of philosophy. They are proximate questions, not ultimate. They do not concern Being and Knowledge “in themselves” and at large, but the state of existence at specified times and places and the state of affection, plans and purposes under concrete circumstances. They are not concerned with framing a general theory of reality, knowledge and value once for all, but with finding how authentic beliefs about existence as they currently exist can operate fruitfully and efficaciously in connection with the practical problems that are urgent in actual life.

In restricted and technical fields, men now proceed unhesitatingly along these lines. In technology and the arts of engineering and medicine, men do not think of operating in any other way. Increased knowledge of nature and its conditions does not raise the problem of validity of the value of health or of communication in general, although it may well make dubious the validity of certain conceptions men in the past have entertained about the nature of health and communication and the best ways of attaining these goods in fact.

In such matters, science has placed in our hands the means by which we can better judge our wants, and has aided in forming the instruments and operations by which to satisfy them. That the same sort of thing has not happened in the moral and distinctly humane arts is evident. Here is a problem which might well trouble philosophers.

Why have not the arts which deal with the wider, more generous, more distinctly humane values enjoyed the release and expansion which have accrued to the technical arts? Can it be seriously urged that it is because natural science has disclosed to us the kind of world which it has disclosed? It is easy to see that these disclosures are hostile to some beliefs about values which have been widely accepted, which have prestige, which have become deeply impregnated with sentiment, and which authoritative institutions as well as the emotion and inertia of men are slow to surrender. But this admission, which practically enforces itself, is far from excluding the formation of new beliefs about things to be honoured and prized by men in their supreme loyalties of action. The difficulty in the road is a practical one, a social one, connected with institutions and the methods and aims of education, not with science nor with value. Under such circumstances the first problem for philosophy would seem to be to clear itself of further responsibility for the doctrine that the supreme issue is whether values have antecedent Being, while its further office is to make clear the revisions and reconstructions that have to be made in traditional judgments about values. Having done this, it would be in a position to undertake the more positive task of projecting ideas about values which might be the basis of a new integration of human conduct.

We come back to the fact that the genuine issue is not whether certain values, associated with traditions and institutions, have Being already (whether that of existence or of essence), but what concrete judgments we are to form about ends and means in the regulation of practical behaviour. The emphasis which has been put upon the former question, the creation of dogmas about the way in which values are already real independently of what we do, dogmas which have appealed not in vain to philosophy for support, have naturally bred, in the face of the changed character of science, confusion, irresolution and numbness of will. If men had been educated to think about broader humane values as they have now learned to think about matters which fall within the scope of technical arts, our whole present situation would be very different. The attention which has gone to achieving a purely theoretical certainty with respect to them would have been devoted to perfecting the arts by which they are to be judged and striven for.

Indulge for a moment in an imaginative flight. Suppose that men had been systematically educated in the belief that the existence of values can cease to be accidental, narrow and precarious only by human activity directed by the best available knowledge. Suppose also men had been systematically educated to believe that the important thing is not to get themselves personally “right” in relation to the antecedent author and guarantor of these values, but to form their judgments and carry on their activity on the basis of public, objective and shared consequences. Imagine these things and then imagine what the present situation might be.

The suppositions are speculative. But they serve to indicate the significance of the one point to which this chapter is devoted. The method and conclusions of science have without doubt invaded many cherished beliefs about the things held most dear. The resulting clash constitutes a genuine cultural crisis. But it is a crisis in culture, a social crisis, historical and temporal in character. It is not a problem in the adjustment of properties of reality to one another. And yet modern philosophy has chosen for the most part to treat it as a question of how the realities assumed to be the object of science can have the mathematical and mechanistic properties assigned to them in natural science, while nevertheless the realm of ultimate reality can be characterised by qualities termed ideal and spiritual. The cultural problem is one of definite criticisms to be made and of readjustments to be accomplished. Philosophy which is willing to abandon its supposed task of knowing ultimate reality and to devote itself to a proximate human office might be of great help in such a task. It may be doubted whether it can indefinitely pursue the task of trying to show that the results of science, when they are properly interpreted, do not mean what they seem to say, or of proving, by means of an examination of possibilities and limits of knowledge, that after all they rest upon a foundation congruous with traditional beliefs about values.

Since the root of the traditional conception of philosophy is the separation that has been made between knowledge and action, between theory and practice, it is to the problem of this separation that we are to give attention. Our main attempt will be to show how the actual procedures of knowledge interpreted after the pattern formed by experimental inquiry, cancel the isolation of knowledge from overt action. Before engaging in this attempt, we shall in the next chapter show the extent to which modern philosophy has been dominated by effort to adjust to each other two systems of belief, one relating to the objects of knowledge and the other to objects of ideal value.


The Quest for Certainty (1933), publ. Capricorn Books, 1960, and at http://www.marxists.org/reference/subject/philosophy/works/us/dewey.htm

Praying for Osama

Osama bin Laden is dead, and since he is with the fish rather than God, why am I bothering to blog on what some religious nutter  in Florida thinks about his Christian duty?

Borga

The bare bones of the story are this: A West Palm Beach man named Henry Borga has slipped his parish priest ten bucks to include the name of Osama bin Laden in the Mass-intentions for Sunday, May 22nd.   Borga says he doesn’t “admire, sympathize or respect Osama bin Laden” but being a devout Catholic he feels it’s the Christian thing to do and that people need to pray for ‘lost sheep that stray from the flock.’

“I’m asking for forgiveness, mercy and compassion for that miserable criminal,”  says Borga.

The parish priest, candidly named Gavin Badway (who has become fond of the microphones waiting for him when he pretends to scurry away from a small gaggle of reporters), says that it’s Church policy to accept prayer requests for anyone, though he understands that “it’s an emotional issue.” Thank you father.  Tea?

Badway: will pray for anyone.

OK, those are the facts. And Sunday, during the so-called “Prayers of the Faithful,” some doody will say (maybe with her eyebrows slightly raised just to let her family know it wasn’t her idea to do this loser of a thing)  “We pray for those who have died:  Mary Reilly, Joe Vermicelli, Charlie Murphy, and Osama bin Laden.  Eternal rest grant unto them, O Lord. And let perpetual light shine upon them.” (Clears throat).

There will be a mumble, with some people saying “Hear us O Lord,” or “Have mercy on us O Lord,” and a hissing few saying “May the sharks fuck him sideways O Lord.” At the pancake breakfast after Mass, no one will talk to Mr Borga except Mrs Vermicelli who will say, “Thanks a fuckload Chuck for having the world’s worst person mentioned in the same prayer with my Joe.”

Mr Borga will just say that he’s just doing what any good Christian would do. Lost sheep. Flock. Etc.  It’s his WWJD moment.

There are a couple of things wrong with this picture, besides its utter stupidity.  First, Mr Borga has every right to pray for his enemies, in the privacy of his bedroom.  But to insert the name of the world’s nastiest pillock into a liturgical celebration where others can’t opt out, “Christian thing” or not, is really a bad idea. Osama bin Laden is not a lost sheep.  He is a wolf, and the sheep have no obligation to pray for the wolf.

Second, the priest didn’t have to do it.  Although Catholicism is notoriously lax in terms of the theology of this part of the Mass (the “Intentions”), its original purpose is pretty limited: the prayers of the Catholic faithful are for the Catholic “faithful” and the “faithful departed”–a spiritual square dance with no strangers buttin’ in trying to steal a little grace.  The standard boring version includes prayers for the pope, the bishops, priests and priestly vocations, the sick, the dying and the dead plus (sometimes) a shout-out of things that don’t make it onto the list. (“Mel’s grandson in Michigan isn’t doing well. Acute Intermittent Porphyria–I thought you’d want to know”). It’s a prayer uttered by the community for the community.  And while I haven’t checked, I don’t remember OBL being a regular communicant at the Holy Name of Jesus Catholic Church. Rule of thumb, however:  If you don’t pray for your arthritic Baptist uncle George to lay off the gin, it seems a bit of a stretch to pray for a dead Muslim who killed thousands of Christians, Muslims (more Muslims than Christians) and a few Jews before he packed it–I mean had it packed–in.  Aren’t there enough lost sheep in West Palm Beach to make a credible flock? What about Barnes and Noble after midnight?

...Need somebody to love....

Like the Ground Zero Mosque, this is a non-controversy fueled by Catholic illiteracy of the Mass, the Church and its traditions.  Not to pray for Osama bin Laden isn’t a lapse in Christian charity or a sin against the Holy Ghost or a repudiation of Jesus saying love your enemies. Which I suspect he said under enhanced interrogation techniques.

Jesus never met an enemy like this, or he might have made an exception. And as for Christian charity,  there are good reasons why you don’t show it to the devil.  If there’s any larger Catholic purpose here, I can’t see it.  OBL was not going to Catholic heaven, so the intercession of the Church on his behalf is supererogatory and baseless.  If he had dreams of paradise and thought his jihad was the way to get there, he doesn’t need the help of Floridians to locate his virgins among the sharmutahs.

My advice? Take up a collection and take Mr Borga and his family out for a nice post-11-o’clock-Mass brunch at the inevitable Olive Garden close to the Church. Leave a chair empty in honor of our fallen hero. Talk to the chair. Tell it how sorry you are and how Jesus forgives him and that we need to learn to talk together like God’s children. Offer him some salad and bread. The wine is all yours.

Is “God” Invulnerable?

Paul Tillich died while I was still in high school. But the embers of his theological revolution–equivalent in theology to Bultmann’s in biblical studies–were still warm by the time I got to Harvard Divinity School, where he taught from 1955 to 1962. I read him assiduously, ran yellow highlighters dry illuminating “key” passages, and wrote the word “Yes!” in the margins more often than Molly Bloom gasps it in the last chapter of Ulysses.

It isn’t that I now regard Tillich as less profound than I did three decades ago. It’s that I now realize he was methadone for religion-recoverers. His key works–The Religious Situation, The Shaking of the Foundations, the multipart, unbearably dense Systematic Theology (especially disliked in Britain when it appeared), and Dynamics of Faith–reveal a soul committed to taking the sting out of what many theologians before Tillich called “the modern situation.”

The modern situation was basically scientific knowledge–the growing conviction that what we see is all we get, and that if we can’t see it we just need better techniques for seeing it.  The glaring exception to this optimism, this faith in scientific know-how–a 1950’s word–was God, about whom it was widely supposed that no lens powerful enough, no jet-propulsion engine fast enough and no controlled experiment sophisticated enough was ever going to discover him.  God was safe, in a weird kind of way, because he was, to use the catchphrase of the time, “Wholly Other.”

There were two ways of dealing with the vulnerability of God to the modern situation. One was to say that God is immune from scientific discovery because he is known only through faith. Bring on your historical criticism, your naturalistic assaults, your so-called “facts,” your rock and roll. The bigness of God just shows the puniness of your methods. Taking this course, however, entailed a repudiation of the idea that God can be known rationally and that faith and reason are compatible rather than hostile modes of determining truth–a rejection, in other words, of the whole previous history of theology, especially Catholic theology.

The other way was to exploit post-positivism, or a theological construction of “Popperism.”  This tactic relied on the philosophical premise that while God can be postulated on reasonable grounds (analogically, for example: shoes have makers so universes have creators) “he” cannot actually be falsified (we know where the shoemaker’s house is; we see him going to it at five o’clock; but we don’t know where God lives as he is thought to be invisible).  We can’t quite be certain that he doesn’t exist, on the same grounds we can’t falsify the existence of anything we haven’t seen, and some propositions (or assertions) about God are tenable, even if implausible, when alternative explanations are considered.

Part of this “propositional” strategy hearkened back to ontology, the idea that God is not directly experienced or instantiated in creation and so in some sense must be greater than it, prior to it, or transcendent, in a way that beggars ordinary description. Theology had never succeeded in reconciling the claim of biblical revelation with the “classical” attributes of God’s aseity and impassibility (i.e., a supreme being cannot change or suffer–“he” is what he is, as Yahweh sniffs in Exodus 3.14), so uncertainty was a kind of safe epistemological cloud to wrap discussion in–in addition to which it had a certain (unrelated) currency in atomic physics which lent it a kind of dubious respectability. This approach preserved the bare notion of the rationality of religious belief, leaving theology room to exploit the doctrine that Christianity is all about faith and hope, the “certainty of things unseen” (Hebrews 11.1).

Faith seeking understanding?

Both positions were so intellectually flimsy (and apologetic) that theologians had to go a long way to create a vocabulary that made them independently and mutually impressive.  That goal, I write to say, was never achieved. Claims were made and games were played, but theology did not succeed in preserving the life of its divine protagonist–not even in the totally cynical and ephemeral God is dead theology of the ‘sixties.

ii

Beginning before the publication of Karl Barth’s “neo-orthodox” tome, The Epistle to the Romans (1922), where the Swiss theologian reaffirms for protestants everywhere the primacy of faith, “serious”  theology became enamoured of the idea that God as God is invulnerable to scientific thought, as the term was understood in the mid-twentieth century.

There were plenty of medieval (and later) parallels to this way of thinking, ranging from mysticism to the “apophatic” theology of some of the scholastics, which even included the acknowledgement that the statement “God exists,” if it means existence of a temporal, durable, knowable kind, is false.

"God does not exist but nothing else matters."

In most areas of life, to say something doesn’t exist means you don’t need to be concerned about it: it can’t bite you or lend you money. In theology, however, this sublime non-existence evoked awe, mystery, dread, and reverence–the very things you don’t get in the morning with coffee and toast. It can even give your own pathetic existence meaning if you just embrace its awesomeness.  Authentically.

Modern discussions of existence as a mere temporal condition of being, especially Heidegger’s, emboldened theologians to think outside the box, with Heidegger being to the thought of the day what Aristotle was to the thirteenth century Church.  Thus Rudolph Bultmann could write this confrontational paragraph in his essay “The New Testament and Mythology” (1941):

The cosmology of the New Testament is essentially mythical in character. The world is viewed as a three storied structure, with the earth in the center, the heaven above, and the underworld beneath. Heaven is the abode of God and of celestial beings — the angels. The underworld is hell, the place of torment. Even the earth is more than the scene of natural, everyday events, of the trivial round and common task. It is the scene of the supernatural activity of God and his angels on the one hand, and of Satan and his demons on the other. These supernatural forces intervene in the course of nature and in all that men think and will and do. Miracles are by no means rare. Man is not in control of his own life. Evil spirits may take possession of him. Satan may inspire him with evil thoughts. Alternatively, God may inspire his thought and guide his purposes. He may grant him heavenly visions. He may allow him to hear his word of succor or demand. He may give him the supernatural power of his Spirit. History does not follow a smooth unbroken course; it is set in motion and controlled by these supernatural powers. This æon is held in bondage by Satan, sin, and death (for “powers” is precisely what they are), and hastens towards its end. That end will come very soon, and will take the form of a cosmic catastrophe. It will be inaugurated by the “woes” of the last time. Then the Judge will come from heaven, the dead will rise, the last judgment will take place, and men will enter into eternal salvation or damnation…”

None of this is literally true–indeed, it has already proved not to be true, Bultmann said; none of these things will happen in the way they are described. Called “demythologization,” Bultmann’s program did not call for a simple recognition that (most) modern people find the biblical landscape fantastic and absurd, but an aggressive embrace of methods that would strip mythology away and leave in its place the bare “kerygma”–the message.

While Bultmann could be cagey about the implications of this message, especially in correspondence with critics like Barth (who refused to accept Bultmann’s definition of myth), he essentially embraced the axiom of Rudolph Otto (overlaid with Heidegger’s phenomenology) that “God is wholly Other” than the categories we associate with existence. It was the theological equivalent of hitting the target in front of you and hearing your opponent say, “That isn’t the target you needed to hit.”

Theologians spent the next forty years coming to terms with the contours (and dead-ends) of Bultmann’s thought. His contribution to biblical studies was to persuade timid seminarians, accustomed to treating the biblical text with reverence rather than historical skepticism, that in taking a knife to scripture they were not making it bleed away its life. They were saving it from the cancer of obsolete thoughts and ideas–freeing the message of authentic existence to be itself, making faith a “choice” rather than blind obedience to discredited ideas and dogmas. Like all closed systems, it made sense from the inside.

While there was much to admire here, there was almost no one to admire it: a program for liberal biblical scholars to consider, conservatives to eschew, and almost everyone else to ignore. Looked at from the vantage point of the twenty-first century, his legacy seems strangely like a plant bred only for the hothouse of academic theology and not suited for life in real weather.

The term “demythologization” acquired a voltage among under-read scholars–especially Catholics and evangelicals–that was rivaled only by the word “atheism.” Not an elegant prose stylist (most German academic theology of the period was pure fustian), Bultmann was at least considered dangerous in the establishment he was trying to save from intellectual disgrace.

iii

In systematic theology the task was roughly the same, though the tracks did not always run parallel and (perhaps surprisingly) the historical track was often more radical than the theological one as “demythologization” merged with the “hermeneutics of suspicion,” a boutique of approaches that put the biblical text at the mercy of historical criticism.

Tillich in 1957, while still at Harvard, addressed the question of God and the modern situation directly in a Garvin Lecture called “The Idea of God as Affected by Modern Knowledge.”  His key theological slogans are all present in this lecture: God is not a “being,” but the ground of all being–being itself.  All language about God is symbolic rather than realistic, including the meaning of the concept of God–which is not the same as the symbol. It is impossible to describe God or to say anything “non-symbolic” about him.

Like other existentialists Tillich was confronted not just by the problems entailed for theology by God’s non-existence but by the implications of that recognition for human existence itself. Sartre, among others, had described the sense of emptiness brought on by the end of God’s moral reign as despair, nausea, freedom without purpose. Tillich thought that Christianity’s emphasis on faith was both an acknowledgement that the concept of a literal God was done for (that is, something implicit in faith itself) and an opening to being. In a vocabulary that sometimes rivals Heidegger’s for pure self-indulgence, this is variously described as the “God above god,” “Being itself,” and “ultimate concern.” It is whatever humans regard as sacred, numinous, holy (in traditional language), but so overwhelming that it requires total surrender. The God of theological theism is no longer the cure but the source of doubt and despair. He

…deprives me of my subjectivity because he is all-powerful and all-knowing. I revolt and make him into an object, but the revolt fails and becomes desperate. God appears as the invincible tyrant, the being in contrast with whom all other beings are without freedom and subjectivity. He is equated with the recent tyrants who with the help of terror try to transform everything into a mere object, a thing among things, a cog in the machine they control. He becomes the model of everything against which Existentialism revolted. This is the God Nietzsche said had to be killed because nobody can tolerate being made into a mere object of absolute knowledge and absolute control. This is the deepest root of atheism. It is an atheism which is justified as the reaction against theological theism and its disturbing implication.  (The Courage to Be, 135)

Tillich’s theism was pure humanism in a different and slightly dishonest wrapper. He confesses as much in his Garvin Lecture when he says that far from science creating the modern situation of universal doubt, it is “the wisdom of twentieth century art, literature, drama and poetry…which reveals man’s predicament: his having to die, his being estranged, his being threatened with the loss of meaning, his becoming an object among other objects” (Idea of God, 108). God for Tillich is non-objectifiable, and thus crumbles when he is made into what the French theologian Gabriel Vahanian called a “cultural artifact,” an idol. Tillich’s theology was at bottom a religious answer to the question Sartre said it was cowardly to answer religiously.

We are already writing the history of post-modernism, and the histories of existentialism are legion. It’s a history of malaise and post-War exhaustion conceived as a general theory of the “human predicament,” the “modern situation.” Tillich believed that by admitting to the collapse of the literal god-concept, the God of religious authority (an admission that by no means all Christians would have joined him in making!), an epistemological substitute could arise to save us from the mess we have made of our world, our society, our disoriented and alienated selves. But the distance between a God who could disappear into the vortex (a favourite image of the period) of despair and anxiety and be purified and strengthened by it (Tillich) and God as “absence, the solitude of man” (Sartre) marked the difference between a reupholstered illusion and the reality that had made atheism an option forced by twentieth-century realities. Both thinkers agreed on the non-existence of God. Yet for Tillich, that was no reason to sacrifice a symbol.

The invulnerators were obviously infected with the spirit of their own formative fantasy, the resurrection, which saw the death of the human Jesus as the prelude to his immortal reign.  Christians as Christians clung to a highly material view of that belief, and the associated belief that as it was for Jesus, so it would be for them–a little less royal but every bit as everlasting.

Tillich’s attempt to recast Christianity in the vulgate of the 1950’s is stale, but not merely stale because it is dated: stale because it is pedantic and wrong–atheism dressed as a bishop, when it was perfectly possible to dress in shirt and trousers and say what you really think and mean: The God of Christian theism is a story. He does not exist. All theological projects to prove his existence have failed. The historical and critical work of the last two centuries has made his existence absurd to increasing numbers of people, making religious beliefs harder to maintain and defend. This has turned millions of people into seekers, and created a situation which humankind has not encountered before. Its outcome is still unknown.

That is what Tillich should have confessed because it is what he thought. Yet his solution was to offer sedatives and linguistic figments to people whose imagination, courage and intellect he didn’t trust. Methadone, as I said, for religion-recoverers.

Of Anachronism

Some atheists have proposed that it is possible to be good without God. They’ve plastered the slogan on buses, developed websites, and sold t-shirts to press the point home.  In a minor spin of the same message, other atheists are saying that despite what “religious people” (or often simply “religion”) says, you don’t need God to lead a good and meaningful life.  If the meaning of these slogans is that millions of people find moral value and meaning outside the constraints of religious faith, I agree–wholeheartedly–and I think I am one of them.  I challenge anyone to a duel if they say my love of art, music and literature is deficient; and I will shoot first.

At first blush, these seem like eminently reasonable propositions–as unarguable as Dr Seuss’s assertion in Horton Hears a Who that “a person’s a person no matter how small.” It’s the language of the culture of self-esteem. And it tells us that, despite anything Dostoevsky might have said a hundred (plus) years ago, it’s the absence of God that makes us all equally worthy; the moral universe does not collapse with his non-existence.

On the contrary, the presence of God, or at least a law-giving god like the biblical god,  creates a value system and a moral hierarchy that modern women and men find unbearable.  There is no universal human equivalence in this God’s world, only saints and sinners, law and law-breaking.  I reject that system as vigorously as do my atheist friends. There can be nothing like a human moral system–a system good for humans–apart from humanity.  Many atheists believe this– and many religious people, even if they don’t, will eventually have to face up to it.

Unfortunately, atheists at this point often try to press their case by cherrypicking the most obscene passages of the Old Testament and raising questions about the mental capacity of people who (they seem to allege) believe the verses still apply. Should parents be permitted to kill disobedient sons after a cursory inquiry at “the city gates”? Should fathers be able to sell daughters into slavery? Is a woman unclean (untouchable) for sixty-six days after the birth of a female child? Does the definition of rape depend on whether it happens near a city or in the country? Is God so petulant that he needs to destroy a world he could have made better, thus causing his non-omniscient self, not to mention his creatures, endless trouble?

The relative ease with which these questions can be tossed aside in disdain should clue the reader to the fact that he is not reading an engineering textbook, that he is treading on unfamiliar, primitive soil.

The script for these objections changes slightly, but the underlying assumption of an unbelief-ful realist doesn’t: The common notion is that if you point out tirelessly what a silly book the Bible is, people will eventually begin to read it, see the absurdity, and say “Eureka: what an idiot I’ve been.”

I think these Aha! moments actually happen in certain cases, but the great majority of believers really don’t care about the absurdities, and the more “faithful” they are to the traditions of their church, the more they will know that the tribal contexts of Old Testament justice (exception being made for the recent use of lex talionis on bin Laden) don’t form part of the living voice of religious tradition in the twenty-first century–just as they haven’t for almost a millennium.

Maybe, as an axiom, unbelievers should flirt with the idea that things that are regarded as anachronistic or irrelevant by the vast majority of religious people are not the best evidence against theism.  That is why, for example, most philosophy of religion anthologies that include a chapter on “Descriptions and Attributes of God” deal with properties and not irrelevances skimmed from the pages of the Bible.

Anachronism is a putative pitfall in constructing any historical argument. To see how, don’t think of Biblical law and custom–think Hamlet. I remember thinking, the first time I read the play, that all the violence could have been avoided if the young prince had just called the police. (Never mind that if that had been an option, Shakespeare would not have had a tragedy.) After all, the evidence was all on Hamlet’s side. Polonius might have testified. Even Gertrude might have broken down and ratted on Claudius, and Claudius himself was not exactly a bastion of resolve. Instead, it all ends badly with everyone dead, including Hamlet. Fortunately I did not offer this solution on my final exam. It would have been my Paris Hilton moment.

But, no doubt, you’re way ahead of me. Hamlet doesn’t call the police because there weren’t any. Armies, sure, but armies weren’t usually called in to settle domestic spats, not even ones involving murder. Shakespeare wrote the play based (perhaps) on a thirteenth-century work by Saxo Grammaticus–when justice was even more primeval and unavailable than in his own day, and when honor, shame and vengeance were largely matters for the family and local magistrates (judges)–closer therefore to the Bible than to modern practice. Ultimately, the stories about heirs, usurpers and murder can be traced all the way back to David and Saul, or to Isaac, Esau and Jacob.

When did “crime” become a police (literally, a city) matter and not something to be dealt with in feudal or family fashion? 1829, when Robert Peel founded the London constabulary–a move opposed by many people in London (and it was, at first, just in London) because the city folk didn’t want a government agency getting between them and justice. Objections persisted north of the border in Scotland and in the Appalachian mountains of Tennessee in the tradition of clan violence. The first “bobbies” were drawn from the lower ranks of society; many were drunks and bullies–uniformed thugs who meted out justice in strange ways. When in 1833 Constable Robert Culley was stabbed to death while breaking up an unlawful meeting, a jury returned a verdict of justifiable homicide and a newspaper awarded medals to the jurors. Let’s not even talk about Boston and Chicago in the nineteenth century.

Our sense of justice and the control of crime is a peculiarly modern invention. Yet we’re perfectly willing to accept (without knowing much about its evolution) that things were different–once. We don’t give a second thought to the fact that the meaning of justice has developed along with ways of enforcing and distributing it.  And without getting into the politics of a recent international event, we (many, anyway) don’t really interrogate the sentence “Justice was done” when clearly what is meant is “Vengeance was exacted.”  The recrudescence of biblical justice in exceptional cases, like poverty, is something we have to expect.

So I am curious about why the most universally abhorrent and rejected verses in the Bible should become symbolic of the entirety of the biblical world view. Why do we accept gratefully the social evolution of secular justice but deny religion the right to its own conceptual evolution by insisting it must be held accountable for things it produced in the Bronze Age? If evolution is the key to understanding how the world has come to be the way it looks to us, what’s the point in insisting that the religious landscape is unchanging?  I frankly cannot imagine a more tendentious assessment of history than that one.

The fact is, whatever he may or may not have said, you will not find Jesus of Nazareth enjoining the poor to sell their children into slavery to raise some quick cash.  But Hebrew settlers a thousand years before him probably did just that.  You will find him exhorting a rich young man to sell what he has, and give it to the poor, in order to be a worthy disciple. A thousand years before, to the extent that this history is known to us, such advice would have been feckless, almost incomprehensible.  It is similar to my wondering why Hamlet didn’t call the cops on Claudius.

Even the Hebrew Bible shows the slow and deliberate growth of a moral conscience over its millennium-long development: Like any idea that lasts longer than a day, God evolves:

This is what the Lord says: Do what is just and right. Rescue from the hand of his oppressor the one who has been robbed. Do no wrong or violence to the alien, the fatherless or the widow, and do not shed innocent blood in this place. (Jeremiah 22.3)

And let justice roll down like waters, and righteousness like an ever-flowing stream. (Amos 5.24)

You’ve heard it said, An eye for an eye, a tooth for a tooth [Exodus 21.24]. But I say to you not to succumb to evil: but if one strikes you on the right cheek, turn to him also the other. (Matthew 5.39f.)

None of these comments constitutes a moral system; I may not accept or believe them (especially my “obligation” to an enemy), and the Church itself has fallen shamefully short if the advice of Matthew 5.39 is taken at face value as a standard for all Christians.

But simple historical honesty requires us to notice the change, and along with that (note well,  my friends who tout the iron law of evolution in all things progressive) that the advantageous ethic, the one that looks for compassion and generosity rather than vengeance and payback, is the one that survives the predations of history.  Not perfectly, but more adequately.

Frankly, atheists will get nowhere with the message of “good without God” and its accompanying parody of religious ethics and its drone about the pure awfulness of the Bible. They might succeed in persuading themselves of the rectitude of disbelief by creating a litany of biblical absurdities.  But then the core principle of development, which is really at the heart of the atheist worldview, is laid aside in favor of a partial and static view of history that careful investigation won’t support.

The moral is, you can’t call the police when there aren’t any. And you can’t blame the Bible for being a “moral archive” of how human beings have changed their minds over the course of 2500 years.



A Child’s Sister

I have just read my sister’s obituary in the Lakeland Ledger.

Five years ago she stood next to me, grasping my hand, as we watched our mother die. Coward that I am, I was the one holding on for dear life. She was the one who escorted me through the rite, just like she’s done for every member of my family since I was twelve. As practical as I’ve come to be about theoretical things that don’t matter, she was always the one who was practical about the things that did.

In 1956 my father and mother piled the family into a Nash Rambler on a hot July day and headed from just south of St Louis to Florida.  None of us had any idea why, except my father and mother, and they weren’t saying.  My sister later told me that it was because we lived in the shadow of a lead smelting factory and that I had developed bronchitis–a disease I assumed had something to do with dinosaurs.   Florida and ocean air are good for the lungs, I was told. It might have been true.  She also told me that the dog I left behind, an English shepherd named Brownie, would track us down as soon as she picked up our scent and be in Florida days after we were settled there.  Though it stopped my crying, it turned out not to be true.

My sister, whose middle name was Sue and thus always Susie to a younger, attention-craving, insufferable brother, sat in the back seat next to me in a car without air conditioning for a trip to a state with water rather than Kansas and Illinois on either side of it.

By the time we got to Fort Myers, our presumed destination and where the Mayflower Van was headed with our worldly goods,  my sore throat had developed into a major childhood illness: the mumps. The cure was rest, Royal Crown Cola, and saltines.  When my mother asked why the cola, the doctor said, in a drawl my father strained to comprehend, “Well, have y’all evah tried eatin’ saltines without it?”

As I baked in a cheap motel room outside Naples, my sister wrote letters home to boyfriends she had thrown over, and in the custom of the day applied white adhesive tape and turquoise blue nail polish to a class ring from her last steady.  Whenever she’d collected more than one ring, she sometimes let me apply the nail polish to a second.  But it was her policy never to remove the tape when the ring was returned.

I will always remember Fort Myers as the place where I ate my first piece of watermelon and learned what blind mosquitoes (“aquatic midges”) were. Driving along the west coast with increasingly frazzled parents–neither parent had a job to go to and they were now confronted with a homesick daughter and a whining invalid son–my always abrupt mother announced abruptly that we weren’t staying in Fort Myers and we began a slow trek inland.

As we did, as though by magic, the solid wall of biteless ’skeeters began to dissipate from the inside of the back windscreen and we focused on eating watermelon. Both of my parents were musicians (of a sort), so we sang, loudly and constantly, as we chugged unhappily along. It was during that unhappy sojourn that I got to be “Bloop” to my sister’s “Bleep” in the Drip Song and the female part in “Baby It’s Cold Outside.” Her favorite anthem that hot season was Rosemary Clooney’s version of “You’ll Never Know,” which I wasn’t permitted to sing with her.

By the time we hit the depot town of Winter Haven in central Florida, a way station for northern tourists en route by coach from New York to Miami on the old Florida East Coast Railroad, we were out of songs, almost out of cash, and the Nash was coughing badly. I was feeling better. My sister was feeling worse. Her homesickness had turned into something real. She’d caught the mumps.

Winter Haven became home, by default.  It had lakes, and palm trees, lots of nice houses, banyan trees, fresh water swamps, foliage like you never saw in the Midwest,  and loads of alligators.  When I got to be a teenager I resented it being in central Florida and so far away from the coast and would occasionally say as indignantly as I could “Tell me again why we’re not living in Fort Myers.” But the story was always the same.  “Your sister and you.”

Our mother found a job, then a better one, and ended up teaching at the local Catholic school.  Our father did what he could do.  Probably having escaped Missouri to avoid working for his German father, and after a financially ruinous try at running a restaurant in Haines City,  he ended up working for my mother’s father.  Worse, as we found out, there were blind mosquitoes in Winter Haven too.

After her one and only year in the local high school, my sister went to New Orleans to study nursing.  The Greyhound trip to Louisiana with my father to see her capped was the biggest adventure of my young life, probably the proudest of his.

She married a boy from “back home,” a usual thing to do, and because back home was still Missouri for her, that’s where he was from.  She had two adorable daughters who became little sisters to me, steadfastly refused–even when they were instructed–to call me Uncle Joe, and spent most of their time seeing if they could squeeze into the little area behind the back seat of my 1965 VW beetle.  In biblical terms, they grew in grace and wisdom.

Years went by. I moved away. There were the usual growing-apart pains that always seem to separate brothers and sisters who occupy different spaces, miles apart. By this point she was the young matriarch of a family that had grown up knowing only Florida as their home. She returned to school, earned a few degrees and became what many people still call a “legendary educator.” Having known her in Girl Scout berets, Halloween party masks, with Calamine lotion smeared over her “blemishes” (our mother detested the words “pimples” and “belly”), it was hard for me to acknowledge the legendary part. But you can’t argue with the newspapers.

She had grandchildren. In August, 2007, one of them, her only grandson, was savagely murdered by a local gang. The effect of this on her was so horrible that the less said about it the better.  It is better not even to think about it. It’s just a theory, of course, but it was something she never recovered from.

My relationship with my sister was not always easy.  It was my fault that it wasn’t. I went from being a young brat to an older one, but always a brat. I mistook her endless exuberance for immortality, and when I learned she had cancer I thought the cancer didn’t have a chance. She would beat it.  She would outlive me by a decade at least.

But she didn’t.

Now I’m the last member of the homesteading troupe that rumbled into Florida without a destination, frightened, sick, and cash poor–when Dwight Eisenhower was still in the White House, when the drinking-fountains in McCrory’s said “White Only,” and the Mass was still in Latin. There is no one to grasp my hand this time, and to make the kiss of death gentle and soft.  Meeting my sister’s death is like  meeting death with his mask off and knowing for the first time–really–that this is what happens to us one at a time.

There is one more song she loved that long while ago, and I have been humming it all day.  It helps.

Atheism and Altruism

No predator ever survived by altruism.  No lioness has ever fed her cubs by taking the feelings of the wildebeest into account–never stopped to think, “She may be a mother, too.”

We’re predators, by evolution.  Our eyes are on the front of our faces and we can run long distances and throw things at whatever we can’t outrun. In some areas, we’ve become soft–our canines are almost useless for killing and serious tearing, but we’ve learned to chop and cook our food as a compromise.  Still, we’re predators.  We chase things that run, things that have brains, and we eat them.  I say this with all respect to my vegetarian friends.  And I fully agree, it’s nicer not to have to chase green beans and potatoes around the garden.  This is just the way things have evolved.  God did not make it this way.

Why that preface?  Because one of the things we have stopped doing is eating each other.  As far back as the time of Hobbes, social theorists reckoned that once upon a time when the food supply was short, we would settle for a member of the tribe across the river.  Hobbes called it, without any special reference to cannibalism, “the war of all against all.”

Freud believed that the primal horde was engaged in ritual people-eating from the start, beginning with sons feasting on the father as soon as the patriarch showed signs of loosening his grip on the clan. Whether Freud (or any later theory) is right, we know that both early religion and early “social contracts” began as taboos against incest and cannibalism. And we know that the persistence of these ancient customs, in the sacrificial systems of early religion and in their rationalized forms–the eating of the body and blood of the Lord in the Christian Eucharist, for example–is an inadvertent and symbolic admission of the vile things we used to do out of habit and custom. Every Catholic who takes the “Body of Christ” into his hands on Sunday is unwittingly confessing his cannibal past.

But unless we’re as far gone as Hannibal Lecter we are predators with a conscience.  Predators who suppress the instinct to kill, except in certain ritualized situations like war.  Even predators who ask questions like “Maybe she has children, too.” There is nothing especially Christian or religious about empathy or compassion.  There is something specifically human about it.

That’s why when I read a story this morning about the Texas senate passing legislation to permit the carrying of concealed weapons on college campuses–a right they’ll derive from the Second Amendment with salt from the First–my first thought was that Texas may be the first state to start the slow march of regression back to the primal horde.

Then I read another article in my inbox.  This one came from “Rational Public Radio,” the media organ of the Objectivist Ayn Rand Institute.

What is irritating about RPR is not its express atheism but what its distinctive form of atheism expresses. For example, after declaring that Christian morality is a slave ethic of subservience and empathy for others, the article proposes a better way:

Now, imagine a world where everyone is selfish. Each man wants to have the best life he can. He wants that in the long run, not just tomorrow. This would motivate everyone to be as productive and industrious as they could. They would go to school to learn valuable skills, they would invest and save for retirement. They wouldn’t violate the rights of anyone else, because they know it can only harm their own life in the long run. Such a world would ensure that everyone is working to maximize their own happiness. The overwhelming majority of them would get it too.  If life on this Earth is all we have, then improving and enjoying our own lives can be our only moral purpose. Without a supernatural god keeping score, man must judge actions as good or evil by how they help him and the people he cares about. Actions must be evaluated on their actual impact. Good intentions do not suffice.  There is no rational basis for altruism, and atheists should reject it. You abandoned god, don’t keep his moral commandments.

The seduction of this proposal is that it does something many “regular” atheists find worthwhile. The ethics of the Bible are based on rules and customs rooted in the Bronze Age. Many of them are outmoded and some are offensive and illegal–speaking just of the Old Testament. Many of the “exhortations” of the New Testament are impractical; I will never love my enemies or (at least literally) agree to be insulted (turn my cheek) seventy times seven times–and I don’t see the value in it.

On the other hand, feeding the hungry, clothing the naked, giving comfort to those who hunger for justice and peace strike me as pretty good ideas, no matter where they come from.  I do not regard them as elements of a slave mentality.  I regard them as expressions of the same stirrings of mind and conscience that caused us to crawl out of the mud, stand up straight, and make something of ourselves.

The Objectivists have been fond of identifying Christian ethics (why they don’t see other religious systems as equally problematical I don’t know) as “altruistic,” as exercises in self-denial.

If you buy this view, then rejecting altruism, as a vestige of Christian ethics, is logically entailed in not believing in God. It is immoral to try to embrace “logical and rational thought” and to hold on to the “moral indoctrination of childhood unquestioningly.”

“Why should atheists view altruism as the moral ideal? What scientific or theoretical evidence do you have to support it? Have you really examined the subject thoughtfully, or have you unintentionally kept Christian morality even after you rejected god?….There is a rational alternative. An alternative that actually improves human life on Earth. That alternative is rational self-interest. Selfishness. A word that is a smear to some and a badge of honor to others. Acting in rational self interest is the only morality that makes sense in the absence of a god to command you.”

For most atheists, the advantage of living without God is the freedom to love, choose and reflect without the constraints of rules thought to come from a higher power, a Divine Enforcer.

But unbelief does not logically lead to a new kind of determinism, an anthropology that puts individual self-interest above the social conditions that affect the happiness of others.

The glimmers of moral reflection that make sense in Christianity don’t make sense because they are biblical–since much of biblical morality is simply incomprehensible–but because we can see in the advocacy of love and forgiveness and generosity sentiments that are fully humanistic, even corrective of some of the bloodier and more violent passages of the Old Testament.

The Bible doesn’t tell us anything about God. It tells us what human beings think, or thought, about God.  As a human book, it tells us mainly about us, and  is also an important source for the development of the moral ideas of the species.  Rejecting its “supernatural” authority, unfortunately, can’t diminish its significance as a moral archive.  This is the basic fallacy underlying the Objectivist form of atheist thought.

In fact, Objectivism is strangely inconsistent on this point: it’s the New Testament it hates. The Old Testament history of Israel, which is largely the history of selfish, territorial schemes against its enemies and persecutors, can only be regarded as objectionable to an Objectivist because it’s related to God. Its core premises are basically exemplary: What could be less altruistic than the story of the Chosen People pursuing their national self-interest without regard for the life and limb of the Unchosen? What is less altruistic than the events of the Middle Ages and the mid-twentieth century that sought to counter this assumption through the vigorous pursuit of national self-interest? Empathy was not involved. Predation was.

The existence of altruism is a hot  topic, almost as important to some people as the existence of God.  As a soft altruist, I believe that empathy, compassion and generosity are important survival skills that we have arrived at over about 50,000 years or so of the “modern” development of our species, which is about 200,000 years old.  Many anthropologists see the development of religion and law as a coordinate of this modern process–an acknowledgement that our distant ancestors could not usually be counted on to do the socially acceptable thing. The archaeological record supports the theory.

As religion declines, however, it should be fairly evident–in terms of the principles of selection that still operate in the human community–that patterns of social adjustment which could once only be expressed religiously (or legally) continue to be expressed because they are socially advantageous. That is to say, some forms of altruism are rational because they work. They are conducive to happiness, the thing that both Aristotle and the American founding fathers who read him thought was ultimately important to human beings. They provide cohesion, structure, and a sense of wellbeing superior to their opposites.

An atheism that is rational in this latter sense will reject the temptation to be swayed by the suggestion that “real” atheism means that we have to be guided by our predator instincts.  That isn’t what brought us out of the mud and made it possible for us to look each other in the eye.

What Arab Spring?

I’ve just read a WSJ interview by James Taranto with (who else) Paul Wolfowitz–“Wolfie” to his friends. That’s right, the guy who had the distinction of working under two Bushes and two US defense secretaries–Dick Cheney (under George I) and Donald Rumsfeld (under George II)–is now looking for a spotlight and a microphone.

Presumably his two disastrous stints in government, the ones that brought us in due course the beginnings of the Iraq conflict under H.W. and everything else under W. (including a strategy for “finishing what we started”), give him the right to be quoted. That’s why Cheney and his elves are making the talk show rounds, trying to find enough crumbs of recognition to make a real piece of pie before the Democrats take it all.

In the interview, Wolfowitz claims that Obama missed the clear signals of the Arab spring. If you missed it too, this is the period that lasted for 28 days in March and April of 2011 before turning into just another sandstorm.

Unable to lecture on the “War of Terror” in the light of last week’s killing of Osama bin Laden, Wolfowitz thinks this is a credible change of topic, a technique he perfected in his attempts to convince a sleepy and perpetually dumb American electorate that Saddam Hussein and Osama bin Laden were really co-conspirators in 9-11, maybe even the same “tearist.”

“Egypt we just bungled completely,” he adds. “I mean, our position was always three days behind whatever was actually going on.” As for Syria, “we’ve failed under both [the Bush and Obama] administrations to recognize how hostile [Bashar] Assad is to everything we want to accomplish in that region,” even when Assad backed foreign fighters killing American soldiers in Iraq. “Now he’s clearly declared himself as an enemy of his own people. At the very least, symbolism matters, and the symbolism of leaving an American ambassador in Damascus. . . . He should have been out a long time ago.  Then there’s Libya…”

Here’s a radical thought. The reason the United States has not reacted decisively to the Arab Spring is because there isn’t one. There are only disaggregated movements that can’t be put together into any coherent pattern. There is instability. There is agitation. There is unrest and dissatisfaction. But Peter Cottontail and American-style democracy are not waiting to hop on stage.

No one knows what the Egyptians are after beyond wanting–having wanted–regime change (which btw Wolfowitz and his bosses were notoriously poor in providing once the dust of invasions had settled). We now know that even during the headiest moments of the “pro-democracy” gatherings in Tahrir Square, 300 demonstrators were hand raping an American reporter in full view of the crowds.  It just doesn’t have that Lexington and Concord feel, does it?

The latest news out of Egypt suggests that before elections are held, the army will act like any army and rule with its boots when areas of conflict threaten public peace.  When they don’t want to intervene, they won’t.

That’s what’s happening right now with Coptic Christians and Muslims, over what might have been a minor incident under Mubarak. The violence started with rumours that a Christian woman who had converted to Islam to marry a Muslim had been abducted and was being held in a Coptic church. Crowds of Muslims marched to the church and hundreds of Christians gathered to defend it. So far, six people on each side have been killed.

Protective of their numbers, the Copts are hated by the conservative Salafi movement, which as recently as last month paraded through sections of Cairo waving placards of Osama bin Laden. (This despite Mr Wolfowitz’s express statement: “I don’t know of a single instance of these Arab freedom fighters holding up pictures of bin Laden. I know many instances of them displaying American flags in Benghazi or painting ‘Facebook’ on their foreheads in Cairo.”) It is good that Mr Wolfowitz’s current remit doesn’t require him to review events on the ground.

On-the-ground reports are grim, and almost all agree that 300,000 protesters plus one swallow doesn’t make a spring:

“…Increasing hostility toward Egypt’s Coptic Christians over the past few months has met with little interference from the country’s military rulers…. Salafis have been blamed for other recent attacks on Christians and others they don’t approve of. In one attack, a Christian man had an ear cut off for renting an apartment to a Muslim woman suspected of involvement in prostitution.”

Looking at the total situation, Justice Minister Abdel Aziz al-Gind says that “Egypt has already become a nation in danger.” What would Wolfie have done that wasn’t done? Invade Tunisia?

Putting Wolfowitz’s Egypt to one side, has spring really arrived in Libya? Or would it be fairer to say that no one knows what the end game is, what the prize is, who the rebels are, or why the United States, France and NATO decided on this as a cause célèbre? Has Obama’s circumspection (France taking the lead in a military operation, Printemps en effet!) been just another case of the “tone deafness” his critics accused him of until April 30th? Or is it an expression of something more rare? Brains, for example.

Sure, Muammar Gaddafi is despicable.  But he has been despicable for forty plus years (he came to power in 1969).  He was despicable when he acquiesced in the bombing of Pan Am Flight 103 over Lockerbie.  When he ordered the assassination of his enemies abroad. When he bombed discotheques in Berlin.

As to the rebels, the story used to be that what began as peaceful protests erupted into violence when Gaddafi got tough.  The rebels had no choice but to fight back.  The reality is that there has been a steady stream of Muslims from Somalia and other African states into Benghazi (filling a vacuum created by the removal of all government offices to Tripoli) and that the fastest growing sect in the region (known more familiarly as “the rebel stronghold”) is Salafism, which is at the root of conflict in Somalia as well as in Cairo.

We also know that in recent months the rebels have been studying western media, jotting down buzz-words that European leaders like to see in press releases–words like freedom, equality, democracy, peaceful change–those sorts of words. Rather than being caught in a doomed situation, where anti-American and anti-Western extremists are certainly among the fighters the allies have now promised to protect, Obama played the steady hand of minimizing US involvement. It’s just what he should have done. But it is intriguing to wonder, What would Wolfie do?

Inspirational sermon by Salafi Muslim, Zarqa (Jordan), April 15

Gaddafi has been able to get away with murder by paying his way back to respectability to the tune of a 3-billion-dollar package to compensate relatives of the Lockerbie bombing victims and assorted other casualties of attacks on American citizens. Who approved it? Who was the broker? Paul Wolfowitz’s boss. George W. Bush, in tribute, signed an Executive Order (13477) restoring the Libyan government’s immunity from terror-related lawsuits and dismissing all of the pending compensation cases. The man who said he would “get” bin Laden dead or alive and then immediately lost track of him was all about forgive and forget when oil was involved.

Does this mean that if only bin Laden had had the sense to negotiate, his sins could have been forgiven for a cool trillion?  We will never know because George W. Bush was no longer dealing, and Wolfie had moved on to the American Enterprise Institute to defend his failed ideas for a safer, stronger all-American world.

After praising Morocco (where spring supposedly began, and not typical of anything that has happened since) Wolfowitz says that Obama “bungled” the stirrings of democratic rebellion in Iran and has been slow to capitalize on what’s unfolding in Syria: “Now [Assad’s] clearly declared himself as an enemy of his own people. At the very least, symbolism matters, and the symbolism of leaving an American ambassador in Damascus. . . . He should have been out a long time ago.”

Perhaps he should have been out when he presided over the assassination of Rafiq Hariri, the former Lebanese prime minister and a liberal reformer by Lebanese standards, during the presidency of George W. Bush in 2005. The deputy defense secretary who looked on in astonishment and without recourse to any plan when Hariri was blown to bits outside the St George’s Hotel in Beirut was Paul Wolfowitz.

Two years earlier, in 2003, Wolfowitz’s superior, Donald Rumsfeld, asked, “Why shouldn’t we go against Iraq, not just al-Qaeda?” According to reports, Wolfowitz claimed that Iraq was a “brittle, oppressive regime that might break easily—it was doable.” While Wolfowitz states his skepticism about interventions in the WSJ article, his reputation belies any such caution: In the famous Seymour Hersh exposé (“Donald Rumsfeld Has His Own Special Sources. Are they reliable?” The New Yorker, May 12, 2003) Wolfowitz is depicted as the cowboy who viewed Afghanistan as a swamp and Iraq as the way to get public attention off the losing campaign to find bin Laden. “There’s no way to go too fast. Faster is better.”

According to Hersh, “little effort to provide the military and economic resources” necessary for reconstruction was made. What Wolfie cared about was making the case for weapons of mass destruction and convincing the pro-Israeli lobby in the United States that America was committed to the defense of Israel at all costs–whatever unraveled as a consequence of imprudent action.

Barack Obama was famous before he became famous as the fastest gun in Abbottabad for calling Paul Wolfowitz an ideologue. Some insults just can’t be forgiven. Now that the surgical strike that should have happened ten years ago has happened, the lean and hungry Republicans are circling, growling that torture works, sneering that we were right to go into Iraq (Q: “Who would argue that the world isn’t better off without Saddam Hussein?” A: “Many of the mothers who lost children–4000 before Mission Accomplished and about the same number, under Bush, since.”) and that no matter what people think of Obama today, he’ll be proved a bungler tomorrow. This is the message of hope being preached by survivors of the last administration.

And it’s got to be true: Just look at how he’s missing all these clues about springtime in Arabia.

Is “Beyond Belief” Beyond Critique?

I don’t know Tristan Vick, the blogmeister at Advocatus Atheist, but I think I like him.

Back in April, when I wrote a series of articles criticizing New Atheism for being loud and obnoxious, Tristan said I was being loud and obnoxious and told me to put a lid on it. I was being so persistently obnoxious, in fact, that if I’d replied to the article then I would have been even louder. So I’m glad I waited. Time’s a healer.

Tristan points out:

Obviously Hoffmann doesn’t know anything about the education of the New Atheists. Sam Harris is a philosopher turned Neuroscientist, and holds a PhD in modern Neuroscience from UCLA. Richard Dawkins is a world renowned evolutionary biologist and he was the University of Oxford’s Professor for the Public Understanding of Science from 1995 until 2008. Christopher Hitchens is an infamous atheist intellectual, a savvy journalist, and graduated from Oxford University. Meanwhile, Hoffman groups other atheists into this ‘unlearned’ category when he adds the abbreviation for and company (i.e., et al.) to his list of passionately despised New Atheists. So I can only assume he means other “uneducated” men like Dan Dennett (Philosopher, PhD), Victor Stenger (Physicist, PhD), Richard Carrier (Historian, PhD), David Eller (Anthropologist, PhD) among plenty of others. For the life of me I cannot seem to figure out how these men reflect the unlearned and unreflective side of New Atheism.”

Well, obviously I know (have always known) all of this, and leaving to one side whether credentials insulate you from being a jerk on occasion (it hasn’t helped me), a couple of other things need correction rather than apology.

First, I don’t passionately despise anyone–least of all any of the people in the paragraph above.   I hugely admire what every single one of them has done in their academic discipline–from Richard Dawkins bringing science into public consciousness to Christopher Hitchens’s sometimes lone crusade for sanity in the world of politics.

I cannot think of a single person mentioned whose scholarship should be impugned or whose credentials should be questioned in their speciality. And I am very grateful that Tristan knows and likes some of what I have written in the field of biblical criticism–which he’s obviously into in an impressive way.

The question really is whether when they (or yours truly) speak as atheists they deserve immunity from criticism, since there is not (yet) a professional qualification in the field that would entitle anyone to speak with greater authority on the subject than anyone else–not someone whose field is evolutionary biology, not someone whose field is anthropology, not someone working as a journalist.   Naturally a good knowledge base, like a second Pinot Grigio at lunch, is nice to have, but when we speak about atheism, we’re all amateurs.  If some atheists admire certain people as spokesmen because they’re “raw and rude” (I think I’m quoting PZ on how young people like it), there are others who like it medium-well and slightly tenderized.  You can substitute Chinese-food metaphors here if you like.

That fundamental point is already implicit in the discussion. I’m guessing that Dr Coyne and Dr Myers don’t bring the language of the blogosphere with them to professional meetings. I don’t either. One of the joys of blogging about things we’re all equally amateurs in is that we can release diffusely the verbal energy that we can’t use on colleagues directly. You might want to tell old Dr Jenkins that as contributions to science his papers might just as well have appeared in the Norton Anthology of Poetry, but you won’t say that to his despicable face. That’s why it’s nice to have a cause you believe in–a mission–and a space to share it with people whose offices aren’t next door. Blogs make us prophets in small kingdoms. But they still don’t make us experts. Popular atheists shouldn’t mind developing fan clubs and cohorts. But fan clubs and cohorts should be careful about turning their enthusiasm for good ideas and sexy styles into appeals to authority. I myself am working on a sexy style.

Without any backup for this, I’d guess that 80% of the best academics at the best colleges and universities embrace some form of unbelief and keep it to themselves.  And besides this, scholarship in most humanistic disciplines (including the study of religion) is implicitly atheistic–everything from history to philosophy to literature.  There’s no room for “supernaturalism”–and that includes God theories–in public or most good private universities. That battle has been won in methodology, if not in the classroom.  If you don’t believe me, try getting an article published in a peer-reviewed journal by arguing that Joan of Arc’s visions were real.

The larger, discussable, popular atheism that seeps out of the academy in the form of books, lecture tours, debates and blogs (no, I'm not saying it all originates there; but Tristan's list suggests that the academy is a major pipeline for discussion and feeds into a thousand internet channels) isn't subject to the same kind of "peer review" that scholars expect when they are speaking or writing as professionals and experts in their field.  That's what makes the "raw and rude" atheism of the blogosphere different from the assumed, methodological atheism of the academy–even though the two forms aren't opposed, or even really in conflict, except as to tenor and style.

Unfortunately, the people-part of popular atheism won't always cotton to the sometimes elitist-feeling, genteel-seeming atheism of the marble halls.  Ask anybody in the list above who has been in full-time academic employment and climbed the tenure ladder about the process: the answer will be roughly the same.  No professor would last very long if she mocked or abused the religious sentiments of a devotee during a classroom discussion–no matter how strongly she's convinced that education means, among other things, getting over it.  When I see atheist comrades being a little too–how you say in your language–robust in this matter once freed from the shackles of classroom teaching, I have to admit my discomfort.  Easy enough at this point to let sparks fly: I seem deficient in my commitment to the truth (as in, Hoffmann coddles believers), and my plainspoken colleague seems deficient in kindness and generosity.  But can't we have, or try to have, both?

Within the last five years I was asked directly by a [here nameless] department chair (and I quote), "How does your atheism affect your teaching of history?"  I responded, somewhat pointedly, that if he had asked that question of a Catholic or a gay I would report it to the dean, but as it was about atheism I would let it ride.  He was curious, so I said, "Because even though there is no God, he has played an enormous role in human history."  (He found it amusing.)

[Image caption: Toynbee]

Does the fact that in popular atheism ideas are thrown onto the battlefield and caught in a crossfire mean that there should be no review or critique of what atheists say at all?  That doesn't seem likely, does it?  There has to be review; there will always be criticism.

But that doesn’t mean that atheists should leep quiet about each other when they find members of the home-side bending the rules of healthy discourse. That includes me. It needs to be said that not all outrageous statements, even if they’re funny, benefit atheism. And I think name-calling and petulance hurts all of us.  In saying this, I hope for agreement, not a dozen replies that begin “See, Hoffmann is learning.  There is still hope.”

Once upon a time, a guy could get excommunicated from the Church for calling a priest a bastard, even if the priest was one.  In some states (believe it or not) it is still a tort (libel per se, or something equally preposterous) to speak ill of (cough) a lawyer.  Academics have never enjoyed such privilege.  That's a good thing, as long as we keep the discussion at the level of ideas.  Unlike priests and lawyers, there is nothing sacred about being an academic, despite the fact that some academics would like there to be.

So here’s the deal.  As long as we’re clear that academic credentials confer no privilege or special dignity in a discussion–a conversation that has to be democratic, no matter how close to the earth we walk–I completely agree that calling people “superjerks” is out of bounds.  We need to develop language that shows the big old largely religious world that atheism isn’t coming apart at the seams.  Again.  Tristan says,

“Criticizing atheism, mind you, is a good thing. It helps us persistent, loud mouthed, fundamental atheist types check our arguments and hone, refine, and improve them. Criticism only seeks to make us stronger critical thinkers. We can learn from positive as well as negative criticism, and criticism allows us the opportunity to learn from our mistakes, perchance to grow better and learn to reason better. But Hoffmann isn’t offering advice; he’s being a dick.”

Can’t say I love being a dick, but I do love what he says about criticism. The worst thing unbelievers can do is split up into grumbling factions of science-atheists, humanities-atheists, and social science-atheists (talk about dicks: just kidding) to see whose atheism is the purest form of the product.  I think keeping the discussion going, even if it occasionally roils into disagreement and criticism, is better than sulking or going it alone. There’s a lot we have to talk about to each other in a world that winks at the grief caused by religious devotion but scorns the wisdom that unbelief represents.

So, Tristan: while I can apologize for being a dick, I can't apologize for being critical, and I don't think you'd want me to.  When I am all grumbly and obnoxious, I really don't mind your telling me.

We all need to get to know each other’s ideas a little better.