But Is He a Christian?
Sam Alvord’s article “But Is He a Christian?” [September/October] caught my attention from at least three directions: first, its provocative title; second, my interest in novelists who are attempting to write, at least, “Christianly”; and, finally, the fact that I have a daughter studying humanities at Houghton College.
My question, as I read the article, was similar to that of Alvord’s students. However, not having read any of David James Duncan’s novels or short stories, I found that my argument rests, at least for the moment, not with Duncan but with several statements or assumptions made in the article.
One paragraph, in particular, arrested me:
Still, I honor the question, because my students come from homes or churches or colleges where the evangelistic imperative has lofty status. I demean them and their history insofar as I scorn it. Where else for them to start to gain their bearings as independent seekers? For most of them, I realize, an answer to this question is tantamount to establishing true north as they strike out as adults into a culture that offers passage to many diverse spiritual and philosophical compass points. I must not confuse my personal resistance with their essential right to begin their journey from their home.
Of course, these students, my daughter included, must become “independent seekers.” But Alvord seems to call into question whether there is, indeed, a “true north.” Since Houghton College, in its doctrinal statement, affirms the Scriptures as “fully inspired of God and inerrant,” as well as “of supreme and final authority for faith and practice,” I have felt it safe to assume that there is a true north to be sought—and found. Are there really “many diverse spiritual and philosophical compass points”? Are we free to choose from any of them? How will an “independent seeker” know when he or she has found the right one?
I seem to recall that Jesus made the statement, “By their fruits you shall know them” (Matt. 7:20, emphasis mine). And it seems to me that the rest of the New Testament spends an awful lot of time and words telling us how that should look and be acted out in our personal lives, with our neighbors, our mates and our children, in the rest of society, and, begging everyone’s pardon, in the church.
I find, too, at least a hint of an implication in the little phrase “from their home,” that a child must necessarily make this journey philosophically as well as physically and emotionally. If my husband and I are attempting—no, struggling—to teach and model truth as we see it based on the Scriptures, must our children depart from our teachings? (Whatever happened to the “faith of our fathers”?) And, if they do depart, what “truth” will they find? Why are we sweating so much blood and tears building up “the Christian home” if it is a given that our children are going to depart anyway?
I have heard two Christian college presidents say, in their charges to incoming freshmen, that this moment signals a departure from their parents. Is it beneath our intellectual pride, even our Christian intellectual pride, to suggest that just perhaps they may also return?
Lest you suspect otherwise, I am not against reading such novels; quite the contrary. Yet, I think we have a distinctly clear compass point against which to measure whether our boat has wandered off course or, indeed, foundered. How can Alvord’s students go out into the world with a confused identity and expect to make followers of Christ?
Vivian Hyatt
Budapest, Hungary
Language About God
I am sure the motivation for Kelly James Clark’s recommendation for a “modest transcendence” [September/October] was a positive one. I am sure he intended to “summon us to a quiet confidence” in God and remind us of the partial character of all knowledge. However, Clark’s proposal of a modest transcendence makes a number of important theological mistakes.
To suggest that language about God is similar to language about rocks and persons violates a key principle of a proper creation doctrine. God is not in the same class as created objects. God is not part of any set or subset of things. Language about rocks and persons may have its own limitations, but they are not the same limitations that bear on language about God. I have only partial knowledge of a rock, but this is not the same limitation as my partial knowledge of God. The rock, even in its mystery, is a “creature,” even as I am a creature. My efforts to understand the rock do not in any sense impinge on the rock’s freedom or sovereignty. These efforts are on a horizontal plane, one creature seeking to understand another.
God is not a creature, but the Creator. My efforts to understand God are not horizontal, but vertical. Any effort to understand God and then find language for God must be in full awareness of God’s primacy, freedom, and mysterious love, which is both ontologically and epistemologically beyond creaturely limits.
Perhaps another way of putting this is to raise the old distinction between analogia entis and analogia fidei. It seems to me that there is an implicit analogia entis in Clark’s argument: Because we can have partial knowledge of concrete objects and other human persons, therefore, we can also have partial knowledge of God. God, on this line of approach, is on the same spectrum of reality as the created universe, and the same epistemological principles apply. Far from preserving the greatness of God and the confidence of the believer, this approach reduces both God’s splendor and thus human confidence and trust. An approach of analogia fidei is a better way to go: God is beyond all human comprehension, yet in freedom and love, God has willed to be God-for-us in Jesus Christ. We know who God is through Jesus Christ as witnessed in the Scriptures. Our knowledge of God is a gift of God; at no time is it our own epistemological achievement.
Clark finds support for his position in the great thinkers of the faith. Yet, I believe Augustine and Aquinas, as well as others like Calvin and Luther, were convinced that all our language about God is inadequate—not just some language about God is inadequate. It’s not merely that language about God has limits; it has no inherent ability at all to convey God. Augustine’s rule that “if we have understood, then what we have understood is not God” includes, I think, a possible corollary, namely, that “if we have understood partially, modestly, then what we have understood is still not God.” All human language must come under the radical critique of God’s transcendence.
But this does not entail a remote and distant God. Quite the contrary. Such a God has from all eternity willed to be a God for humanity, a God who in freedom and love is for us. To find provisional and faulty human words to express this reality encourages the kind of confidence and faith in God’s benevolence towards us that goes well beyond the proposal of a modest transcendence.
Leanne Van Dyk
San Francisco Theological Seminary
San Anselmo, Calif.
Kelly Clark replies:
Leanne Van Dyk accuses me of an odious and obvious theological error, treating God like a rock. Although this might be OK for psalmists and Fanny Crosby, it’s not OK for me.
I am baffled, however, by her criticisms. I don’t claim that God-language is similar to rock-language except that both are woefully limited in capturing the full reality of their proper subject matter. Rocks and gods vastly exceed human language and comprehension. I neither said nor implied that God is a creature. I do claim that some person-language may apply to God if we were created in God’s image; if not, then not. So there is something of a conditional analogia entis in my argument; mea culpa. Of course, even if we were created in God’s image, God will have many properties that are not like any properties possessed by humans, so God may need to reveal those to us either through nature, reason, personal experience, or revelation. And God will surely have properties that could not be revealed to humans—we could neither experience nor comprehend many divine attributes. All of this is consistent with my modest claim that we can, God willing, have sufficient knowledge of God. If God wished to reveal such truths to his creatures, our knowledge of them would be, to quote myself, “piddly.”
I am puzzled by Van Dyk’s claim that religious knowledge and language “must be in full awareness of God’s primacy, freedom, and mysterious love which is both ontologically and epistemologically beyond creaturely limits.” How can we possibly be in full awareness of anything that is beyond our linguistic and cognitive limits?
If “God is beyond all human comprehension,” then we could not comprehend God even were God to reveal Godself in Jesus. Van Dyk can’t have it both ways: Either God is not beyond all human comprehension, or God could not possibly reveal Godself in Jesus. Her proposal that we use “provisional and faulty human words to express” divine reality (as revealed in Jesus) is not, as she claims, contrary to modest transcendence; it is precisely how a Christian might express modest transcendence.
If language has no inherent ability to convey God, as Van Dyk asserts, then we can know nothing, even piddly, about God. Such a claim smacks more of agnosticism than theism. We are left to worship what Locke fetchingly calls the “something we know not what.” We could not know anything at all of the truth that Van Dyk proclaims, that God “in freedom and love is for us.”
Christian Publishing
In his article “Christian Journalism on Trial” [September/October 1997], John Wilson takes me to task for an article that I wrote for World magazine on the state of Christian publishing, as represented in the Christian Booksellers Association (CBA). He agrees with my assessment of the “spiritual junk food” that takes up so much of the shelf space in Christian bookstores. He wishes I had stopped there.
But what I continued to do was probe the reasons why the CBA turns out so many bad books. I argued that there is so much money at stake in the bookselling industry that commercial considerations are trumping theology. Since the largest publishers are now owned by secular corporations and stockholders, the overriding concern has to be turning profits, effacing what used to be the “ministry” concerns of their founders.
Now there is nothing particularly conspiratorial, paranoid, or even conservative about this line of criticism—railing against big corporations and uncovering the underlying economic interests usually characterizes scholarship from the Left. This is a concern, however, for conservative Christians because it has resulted in a religious commercialism that replaces the Word of God with what people want to hear, a theological consumerism that has created a pop Christianity increasingly removed from the real thing.
Wilson completely misreads key sections of my argument. My evidence for Rupert Murdoch’s influence on Zondervan is not the cancellation of an academic line but the memo sent from the Murdoch-owned parent company HarperCollins telling Zondervan to publish only titles that would sell above a certain quota, thus requiring them to be governed by the mass market. My account of Frank Peretti’s career by no means implies that I think he was a great evangelical writer corrupted by the world; rather, it shows the struggles of small publishers when the big players buy away their most successful writers.
The main complaint about my article is that it is unbalanced, and Wilson lists scores of worthy titles coming out from Christian publishers. But I did say, at several points, that many good books are being published, that there are many fine writers and devoted editors working in evangelical presses. My emphasis, however, was on religious commercialism, which is the problem I believe needs to be highlighted. The fine books mentioned by Wilson, however, are scarcely to be found on the shelves of most Christian bookstores, which tend to promote instead the pop theology, self-help books, celebrity mongering, and formula fiction that I am complaining about.
My problems with InterVarsity Press and Eerdmans are those of a frustrated admirer. These publishers, to their credit, are resisting the temptation to crass commercialism and still put out serious and challenging theological works. But, while they are putting out some good material, as I say in my article, they are also flirting with what could only be described as theological liberalism. This is disappointing to us theological conservatives.
I think that the main complaint about my article is evident in the subtitle of Wilson’s critique, with its arch little quotation marks around “biblical” journalism. The Christian subculture is not used to criticism, much less self-criticism. World is trying to take a prophetic stance against the evils both of society and the church. Though some might think we are wrong or perhaps too conservative in our theology, our critics have been attacking our alleged lack of politeness rather than dealing with the substance of the issues we are trying to raise. Like Jeremiah, our words may sting, but our lamentations over those we criticize and yet love are heartfelt and sincere.
Gene Edward Veith
Culture Editor, World
John Wilson replies:
Rather than reiterate the important points of agreement and the substantive differences between Gene Edward Veith’s account of Christian publishing and my own, I would urge the interested reader to take a look at his cover story in World (July 12-19) and my response. One point in his letter, however, needs clarifying here. Veith concludes that the “main complaint” I bring against his article is World’s “alleged lack of politeness rather than dealing with the substance of the issues we are trying to raise.” This is emphatically not the case. Indeed, I commended World for the magazine’s “robust engagement with critical issues.” Our differences have to do with truth, not with style. Veith claims, for example, that at Zondervan, the “overriding concern” is “turning profits, effacing what used to be the ‘ministry’ concerns” of such Christian publishers. In contrast, I have argued that ministry concerns are still vital at Zondervan.
Nor is World at all alone in offering sharp criticism of fellow Christians. The self-flattering notion that the “Christian subculture is not used to criticism, much less self-criticism”—where World is cast as the lonely, courageous, prophetic voice—will not stand up to even 30 seconds of historical examination. I would suggest for starters that Veith read Joel Carpenter’s Revive Us Again: The Reawakening of American Fundamentalism (reviewed in the November/December issue of B&C) for a reminder of the withering controversies that have characterized evangelicalism throughout its history and continue to do so today. Unfortunately, all too many of these controversies have been marked by the conspiratorial thinking and the misleading treatment of evidence that mar Veith’s account of Christian publishing.
John Wilson, Editor
As evangelicals we should remain steadfast in our affirmation of the primacy of Holy Scripture, but this should not be taken to imply scriptural exclusivity. We need to reason beyond the parameters of Scripture as we relate the doctrines and themes of Scripture to the new challenges to the faith posed in every succeeding generation. Just as the church came to the right conclusion regarding the two natures of Christ and the Trinity through deliberate and prolonged reflection on Scripture but also drawing upon the intellectual tools provided by the culture of that time, so the church throughout history and in our time must wrestle with the implications of the message of faith in dialogue with all other Christian communions. Yet we must never forget that church tradition can be deceptive, that again and again it is tempted to transgress the limitations imposed by Scripture, that it needs itself to be continually purified and reformed in the light of a fresh appropriation of the Holy Spirit. Church tradition can be a salutary guide to faith but only when it functions under the ruling authority of Holy Scripture.
—Donald G. Bloesch, Jesus Christ: Savior and Lord
Something strange is happening among America’s cultural elite. Small groups of the intelligentsia—novelists, neurosurgeons, staff writers for the New Yorker—are meeting with rabbis or pastors to study the Book of Job or the Gospels. People who wouldn’t have been caught dead with a Bible five years ago are poring over Scripture with the zeal of seminarians.
There have been signs of this surprising surge of Bible study for some time. One is the popularity of literary approaches to the Bible, including a whole shelf of volumes in which writers of one sort or another (not biblical scholars, that is, but creative writers), generally heterodox, take a crack at interpreting this or that chunk of the Bible. A current example is Joyful Noise: The New Testament Revisited, a collection of original essays edited by novelists Rick Moody and Darcey Steinke (Little, Brown, 247 pp.; $23.95, hardcover). Moody explains that “the idea for this anthology came suddenly and organically, at a dinner party, in the midst of a conversation about the contemporary hegemony of the religious right. That night, I blurted out to friends that the only way to oppose these contemporary moralizers was to engage in a debate about the meaning of the New Testament.” (For an overview of literary approaches to the Bible, see Leland Ryken’s article, “Bible Stories for Derrida’s Children.”) Another sign is the Genesis phenomenon. (Look for David Jeffrey’s survey of the bumper crop of Genesis books in a forthcoming issue of B&C.)
There’s one problem with many of these Bible studies for intellectuals: the participants feel free to discard anything in Scripture that rubs them the wrong way. For Rick Moody, that includes “the repressive, the punitive, the intolerant” image of God and morality “defined in the Old Testament, in the Book of Revelation, in some of the Pauline epistles of the New Testament.” The model for these Bible studies is a “conversation” in which God (if he exists—and you are free to assume he doesn’t) is a partner among equals. You know those people who always have to be right? God is like that. He has some interesting things to say, but at times he tends to be domineering, and at other times he’s plain confused—at which point you simply have to put him in his place.
“This radical notion—that the Bible not only isn’t factual, it’s not always right, either—may be frightening to many religious Christians, but it’s what lets participants join this ancient and ongoing conversation. We do not have to buy everything the Bible says. We just have to listen to it and to each other.” So says Ann Monroe in “Does the Bible Tell Me So?” (Mother Jones, December 1997), an excellent guide to the assumptions underlying this user-friendly mode of biblical interpretation.
Monroe quotes the poet and translator Stephen Mitchell: “If you approach the text as ‘truth,’ you can’t possibly get to a deeper place of intimacy with it. With only one pole, there’s no place to go.” So what have believers been doing all these centuries when they thought they were immersing themselves in God’s Word and experiencing intimacy with him? Mitchell explains what they missed: “Conversation is one of the deepest and subtlest ways of play and growth and intimacy, and it’s a bipolar experience.” Who’s going to tell Saint Teresa, and Martin Luther, and Chrysostom, and Saint John of the Cross?
Rick Moody is absolutely right: “Every generation interprets the Bible for itself.” And yes, that interpretation, as Donald Bloesch acknowledges, should be undertaken “in dialogue with all other Christian communions”—and with Jewish readers as well. But the “ruling authority” for each generation’s interpretation must be Scripture itself. By what authority do Moody and Steinke and Monroe and Mitchell decide what in the Bible to accept and what to reject, if not by that very “truth” they disdain? (Here we are in the upside-down world illuminated in Philip Yancey’s article in this issue.)
In Monroe’s account, “conversation” is a mantra: “To read the Bible as a conversation is to read it as a question, not an answer, a starting point, not a final declaration.” Well, questions and answers both have their place, it would seem; does one without the other make any sense? By all means, let us have “imagination and engagement,” as Monroe suggests. There are rumors that the Holy Spirit is not averse to imagination. But let us not suppose that we are the judges of Scripture, when it is Scripture that judges us.
The Cost of Living in a Suburban Paradise
I live in a peculiar town. A friend refers to it as the most “unreal” place she’s ever been. It is not that its residents are particularly different from those elsewhere—on the contrary, they are much like anyone else. Rather, it is peculiar because those who live here have been successful in creating a place that looks to many of them like utopia.
I was reminded of this recently after taking a tour of my town with my daughter’s Girl Scout troop. As we drove by the sprawling police facility, we were informed that our police are efficient in catching those criminals who dare to enter our town. One envisions ancient city walls lined with valiant sentries keeping the enemy at bay, while peace and harmony reign within. To some in my town, this vision is real.
But our peculiarities do not stem solely from our view of the outside world. A unique and fascinating culture has evolved within our city walls. We are the land of “Soccer Moms,” that potent political force of the 1996 election. The political influence we were supposed to have was lost on us—we were spending too much time carpooling to have any meaningful involvement in the political process. My family’s recent attempt to enjoy a relaxing summer was squelched when we couldn’t find any free time in between day camps, swimming lessons, and soccer.
We are a town where cats must be kept on leashes. A town where we cry “taxation without representation” when our cul-de-sac isn’t snowplowed in a timely manner. We are a town where the planning commission prefers to see groundbreaking on yet another strip mall rather than a church. A town whose motto might just as well be NIMBY—“Not in my back yard!”
We are what Joel Garreau writes of in his book Edge City: Life on the New Frontier. Edge Cities, according to Garreau, are more a state of mind than a physical place. Our boundaries might not be clearly defined on a map, but we know what we are. We are a self-contained suburban area that “has it all,” from jobs to shopping to entertainment. We are “the culmination of a generation of individual American value decisions about the best way to live, work and play—about how to create home.”
Garreau refers to such cities as “monuments to the maximization of the individual ego and monuments to profit.” My town is what results when people with financial resources attempt to create the perfect environment. It is a place where we have at least the illusion of control.
While there was a time when I self-righteously rejected the notion of living in such a cloistered environment, I have to admit that I like it here. Why wouldn’t I? I live in a world that has the “best” of everything: Great schools, low crime, perfect lawns, a quaint downtown, a superb park district, and, best of all, the ability to purchase anything I want or need 24 hours a day. My early negative perceptions of such an environment have been tempered by the discovery that both my old city neighbors and my new suburban neighbors are really very much the same. Both want control over their environment. But it is my suburban neighbors who have succeeded.
So what is the problem here? Why rain on this parade? Because I believe the problems that come with the building of these walls can be significant, particularly for the church. We must come to terms with the social isolation that we have created and actively maintained. Church consultant Lyle Schaller believes that suburbanization tends to divert our attention (and thus our involvement) away from larger issues “out there,” while focusing our attention on what is happening in our own communities. This myopia is particularly strong in Edge Cities, where we are self-contained and often have no reason to venture outside our walls.
We have come to believe that the concerns of our community are the only thing to be concerned about. Thus, we expend our energy on perfecting our community: we demand more from our schools, more from our municipal government, and inevitably, more from our churches.
Schaller points out that the demand by consumers for higher quality products extends to churches themselves. People choose big institutions (churches) because they want choices, convenience, a strong consumer orientation, and specialized services. People want the best in preaching, teaching, and children’s programs. Can you blame them?
A statement by Garreau caught me short. In contrast to the old idea of a parish church, he observes, “A large modern church functions like nothing so much as a spiritual shopping mall. It is surrounded by a very large parking lot located astride a good network of roads.” And while I may be carrying the analogy further than Garreau intended, it left me wondering. Have Edge City churches become spiritual shopping malls, complete with products I can pick and choose from, selecting only that which catches my eye, fills my needs, and makes my life more comfortable?
From my perspective, they often have. Possibly by necessity. When in Rome, do as the Romans do. It may be necessary to be consumer-oriented in order to draw the consumer in. But what then?
The challenge to the church only starts here. The challenge is to take those consumers who enter our churches seeking to be served and to be fed and bring them to a place where they seek—with a passion—to serve and to feed. The challenge is to take those who worship a god they can define and control and transform them into servants who are willing to cede their control to an Almighty God. The challenge involves breaking down the walls, stepping over the rubble, and venturing out into a world not of our own making.
It takes more than good intentions, however, to venture through the rubble into a strange world. When I have attempted to reach across racial and economic barriers, to engage in real interaction and not just handouts, I have been frustrated by my lack of ability to control others and their environment. I have been frustrated by my inability to “fix” them. If control is such a priority within the walls, why wouldn’t it be on the outside? Thus, I return to my cocoon. The pull of the comfort and control within these walls is far stronger than the pull to venture out.
I recently visited an inner-city church that has a vision far beyond its material resources. As we stood and prayed in an abandoned auditorium that will someday be a House of Prayer, my joy for this church was tempered with sorrow. Not at their poverty, though they are indeed very poor. And not for the monumental task before them, though the work to be done is overwhelming. No, my sorrow stemmed from the fact that they were witnessing the power of God in ways that my suburban church cannot yet see. Their vision is so beyond their control, so beyond their resources, that they must allow God to provide—and provide he has, in incredible ways. They have the privilege of seeing God work with a free hand, not constrained by those of us who insist on being on the planning committee.
Deborah Windes is a consultant and a writer. She lives in Naperville, Illinois.
Is Christian Celebrity Oxymoronic?
Ushering a bishop into a big autographing event, I give him a four-inch button imprinted with the title of his book. He winces. “Won’t wearing this be incredibly self-serving?” “Yes, of course,” I nod, grinning.
In 20 years as a publicist I’ve seen many changes in how media perceive religion. Faith was once a talk-show taboo, back when we had taboos. Media have come to realize that religious topics can hook audiences, especially when advocates become hostile. Many media, however, also appreciate our longing for inspiration. What builds ratings today reflects the ambivalence of a society that seeks heroes yet is suspicious of what is noble. Working with inspirational books constantly shows me just how enmeshed I am in this conflict.
The pastor was not telegenic. Nor did he project the energy which radio producers believe lights up the phone lines. (“We love religion,” one of them says. “It makes people fight.”) The pastor declined to let me use what would have been the headline material; he wouldn’t name the Names who had abandoned him. I had to sell what they call a soft segment: a good man’s faith and fortitude helped him endure evil and forgive his persecutors even as they inflicted pain. I devised a candid yet self-protective way to pitch the gentle pastor: “He isn’t a Type A dynamo, but he’s articulate, and his story is compelling.”
Sales of his book were modest. The PR was handicapped by humility. Unfortunately, humble isn’t what many media want when they cover religion. They know audiences respond to feeding frenzies. Freak shows. Culture wars with commercial breaks. Soft doesn’t sell in any format today. New Yorker editor Tina Brown says that it takes a shriller, hard-edged press to compete with TV: “Once in a while you have to bite the hand that reads you.”
The power and ironies of publicity haven’t changed much over the years. Yet the electronic age creates bizarre situations as we get further removed from the sources of our stories.
A publicist in my office sits beside a new, eager-to-learn intern. She’s like a sponge, this girl, soaking up media and author experiences. She’s excited about discovering the real and often peculiar PR world beyond the classroom.
The publicist hangs up after talking to a lawyer who had hired a ghostwriter to package his story.
“So, what was the matter?” asks the inquisitive intern, who had overheard the publicist rehashing an interview with our “author.”
The publicist explains: “He’s disappointed because the guy who interviewed him hadn’t read his book.”
The intern is puzzled: “Why? He didn’t write it!”
Celebrity has become so detached from achievement that we can discuss famous people whom we have never actually seen do whatever it is they do. From gossip columns, I “know” Evander Holyfield, Marv Albert, Lyle Lovett, Sinead O’Connor, Susan Lucci, the Smashing Pumpkins, and The Artist fka Prince. But I’m clueless about their work.
I cringe to see five-year-old Chicago kids who used to wear Michael’s (Jordan Rules, Second Coming) number 23 now wearing Dennis (Bad As I Wanna Be, Worse Than He Says He Is) Rodman’s 91. What is this generation coming to? Us, perhaps?
I consult with the Chinese woman who wants feedback on a press release for her pastor’s book. It’s about the path to holiness, heaven, wisdom. Media who get hundreds of pitches a day won’t buy these abstract notions, I warn. Americans need to know: How does that impact me?
No, it doesn’t matter that your pastor has hundreds of thousands of followers abroad; what can he say to people in Nashville? Unless … are there any famous followers? Is there a Richard Gere or a John Travolta?
She wants me to understand that her faith is grander than this, that the book’s message can’t be reduced to a sound bite. I explain that in America, she needs a hook. She blinks. Embarrassed, I tell her the comic’s bit about the evolution of our culture as seen in our magazines: From Life to People to Us to Self. She wonders how she will explain this to her people.
Is Christian celebrity oxymoronic? I want my role models to wield an iron fist in the world while keeping a foot in the kingdom. If such gymnastics cause them to stumble, they can get a good advance on their true confession. Which confirms the doubts most of us had all along. What do my heroes—and my readiness to accept, even anticipate their downfall—reveal about me?
On the secular front, consider Domestic Goddess Martha Stewart. She urges women to aspire to efforts we find laughable (e.g., making our own marshmallows), even as we succumb. We buy into her vision that we will find fulfillment by putting seasonal outfits on our lawn ornaments, even though we know better. We get a kick out of a woman who builds a vast corporate empire selling stressed-out wives/mothers household calendars that require four in service. Are we surprised to find an embittered ex-husband in the wings, waving his Martha-bashing book?
Perhaps our ambivalence about heroes indicates vestiges of a culture that identified heroism with the once-manly courage of attack. Onto that cultural heritage layer today’s worship of self-actualization—make that speedy self-actualization. Perseverance and humility now seem like wimpy virtues. “Blessed are the meek”? Not on this planet.[1] Ambivalence flowers when new achievers thumb their noses at established achievers on behalf of those of us who are bored with them.
In volume 3 of his Companion to the Summa of Aquinas, Walter Farrell reminds us that when we let others provide our heroism (and perhaps our anti-heroism), we resign ourselves to abandoning the possibility of the heroic in ourselves. If I can never be a hero to my family or in my work, my life becomes one long, meaningless Maalox Moment. For my heroes out there, I cheer, “Go for it.” To the hero in me, I mutter, “Chill out.”
Yet vicarious heroism is always unsatisfying, Father Farrell points out. “Courage is as necessary for the living of human life as air, food, or drink; not only the courage of the venturesome, but also that principal courage that holds on, even when holding on is the best [a person] can do.” Trials are a part of the hero’s journey; indeed, difficulties form the hero. Human life is the real adventure, and genuine heroes are those who possess “the courage that refuses to relinquish the good.”
As a publicist, I notice how often after Jesus healed, resurrected, or exorcized he said, “See that you don’t tell this to anyone” (Mark 1:43). It was a tall and always impossible order. (How does one respond to “What happened to your leprosy?” or “Weren’t you, ah, just dead?”) So those who were healed publicized the matter, and then “Jesus could no longer enter a town openly but stayed outside in lonely places. Yet the people still came to him from everywhere” (Mark 1:45). We’re drawn to authenticity. But sometimes we want it to hang in another neighborhood (Matt. 8:34).
I see myself in the leper and in the crowd. I hope to encounter the incarnation of all that I might strive toward. I seek truth and charity in this world. Worthy books and humble authors are often a source. Still, I’m attuned to the buzz of that skeptic in the wings. Can anything good come from Nazareth? Watts? Orange County? Doubleday?
When I use the Bible as something other than a weapon for my petty agenda, I find stories of incredible human courage, of valor that amazes even the frail people who exhibit it. The ultimate Hero asks me to build that courage within my vain, idol-worshiping, fearful, mean, impatient self. Heroism is within our power, through grace. It allows us to seek out and to accept no substitutes for the good and the true in our sorry, silly, and sometimes wonderful world.
Carol DeChant is the head of DeChant-Hughes & Associates in Chicago.
1. A survey shows that belief in “the meek shall inherit the earth” is related to income. Only 36 percent of Americans with an annual income of more than $60,000 believe it. Of Americans with an annual income of less than $30,000, 61 percent believe it. (Reported in Harper’s magazine, March 1996.)
At a gangly two and a quarter hours, Devil’s Advocate is without doubt history’s longest and fanciest lawyer joke, albeit a grim and sometimes floridly lurid one. At once a campy, awkward melange of Rosemary’s Baby and The Firm, it is also the best lawyer film since Sidney Lumet’s incomparable 1982 classic, The Verdict, in which an ambulance-chasing drunk (Paul Newman) finds a quiet but full-blown redemption. If Devil’s Advocate is any evidence, we have now concluded that lawyers are quite beyond hope, for in this film, well, the ultimate Bad Guy wins, and wins big.
In the real world, needless to say, that is not a cheerful prospect. What makes Devil’s Advocate fun is the wit and ingenuity of its incisive portrait of evil, especially of the way evil accomplishes the destruction for which it yearns. Give Nathaniel Hawthorne a smile and a camera, put him in contemporary Manhattan, and you might get something like Devil’s Advocate, only it would be a lot better.
The naive young man who runs, Hawthorne-style, into big-time evil is Kevin Lomax (Keanu Reeves), a back-country Florida lawyer who has never lost a case either as prosecutor or defense attorney. His success lies in an uncanny ability to pick sympathetic juries and, fueling that, an egotism that wins at any cost, even to the point of exonerating ugly-guilty molesters.
A record like that attracts the attention of big-time New York law firms, and Lomax finds himself and his pretty wife whisked away, with seeming magic, to the Big Apple, on which he is more than eager to feed. Wooed and beguiled and successful, Lomax lands on the fast track as the protege of head-honcho John Milton (Al Pacino), who cavorts here, barely incognito, as Old Nick himself—wily, depraved, and bemused. Milton (the name is a feeble literary joke, alluding to the poet William Blake’s contention that, in Paradise Lost, his great predecessor John Milton actually sided with Satan) spends a lot of time “in the air” or in New York’s subway underground. Get it?
But Lomax doesn’t get it. On first meeting, Milton invites Lomax to “walk with” him atop a skyscraping high-rise, a roof that seems the very fount of the world, and there Milton tempts him, body and soul, with Manhattan itself and, by extension, the whole of the world. Nor does the kid catch the clues in Milton’s peculiar habits and capacities: unsleeping, ubiquitous, clairvoyant, predatory, omnilingual, and omnisexual. Clouding Lomax’s sight, of course, is the fact that he soon gets the perks of a partner, and before he blinks, he is a partner. A fundamentalist kid from Florida should know better, even if he is on “parole” from his past.
The rub comes when Lomax’s wife, Mary Ann (Charlize Theron), gets spooked, quite literally. The opulent veneer of the ultra-posh world in which they live cracks to expose a sink of corruption and perversity within, which the film rather relishes depicting. For one, the firm’s pampered boutique-hopping wives prove to be very tony witches, having long ago joined Milton’s minions. Before long, nightmares, infertility, and eventually Milton himself plague the young wife, and she slides from homesickness into terror and catatonia.
All the while husband Kevin fervidly pursues the defense of a rancid mega-developer accused of killing his own wife, stepchild, and maid. So oblivious is Kevin that even Milton counsels him to leave the case to tend to his disintegrating wife. Later, says the young barrister—after he wins this really big one. And then come the surprises.
The first of these is the full revelation of Milton as Satan. What impresses is not his cartoon loathsomeness, which he will finally repulsively exhibit, but his integrity and moral intelligence. The Devil is smart, fair-minded, honest, and patient, which is a lot more than he allows for God. In dealing with humankind, he does not so much overtake or possess people as entice them, providing the occasion for them to achieve what their darkest hearts really want.
Thus, when Lomax indicts Milton for Mary Ann’s dire fate, the Devil not only denies responsibility but pins the crime on Lomax, quoting the young man back to himself in Lomax’s own voice, emphasizing that the husband could have saved Mary Ann at any time; instead, he was “involved with someone else—yourself.” To be sure, Lomax is very sharp, slick, pretty, and successful—all those traits his culture esteems as redemptive—but he selfishly makes his own fate and, worse still, Mary Ann’s doom. Self-love is, after all, as Milton tells him, the one sure-fire “all-natural opiate.” Alas, Lomax does what comes naturally, just being a good lawyer, which he interprets to mean winning, always. He proves an easy mark, for as the Devil likes to repeat, “Vanity is my favorite sin.”
For all the wonders of his guile, Milton-Satan is nonetheless not nice. Satan wants it all—namely, the death of Heaven—and he gleefully subverts everything he can in order ultimately to invert the God-ordained order of Love. His hope is that with enough lawyers running around, Satan’s “new priesthood” of numberless Kevin-clones, Earth’s stench of corruption will reach to Heaven and suffocate All-Goodness.
The tough part is that Milton’s demeanor belies this: the firm looks properly corporate, even though it smuggles arms, deals in chemical weapons, dumps toxic waste, and launders drug money, and Milton himself is debonair, caring, and witty, an attractive, pleasant, fatherly fellow. Only after a long while does it come clear, and then very jarringly, that Milton is nonetheless the Eternal Abuser, who dangles sweet blandishments to entice the curious to hideous fates, which are here rather too graphically presented (the movie is not for the tender, earning every bit of its R rating).
The biggest surprise comes in the late revelation, too late for Lomax, that Milton is Lomax’s very own father. The truth is that 30 years before, Lomax’s fundamentalist mother was seduced by a Bible-spouting waiter at a youth convention in New York City. And now Milton would have his son mate with his half-sister, predictably another lawyer, to conceive the anti-Christ, and that indeed would be the beginning of the real End.
The penultimate Hawthornian twist comes when Lomax suddenly awakes, back in Florida, to find that this personal cataclysm has been but a dream, a nightmare from who-knows-where. Chastened by his glimpse of the hell he wrought, he opts to abandon law, home free at last, or so he thinks. In one last ironic slice that would gladden the sober heart of Hawthorne, the ending makes crystal clear that the Devil has not forsaken young Kevin but will have another go at that thirsty ego, more vulnerable than ever amid its presumption of reform and rectitude.
Devil’s Advocate is by no means a great picture, especially as it sprawls around and turns very talky. Much of that talk is pop theology from Satan’s point of view (God is the “Prankster” putting together a “cosmic gag-reel”), but this exposition mostly comes in a rush at the end when the screenplay tries to clarify all those ideas it hasn’t gotten around to dramatizing.
More bothersome still is the extent to which the film exhibits the very crassness that it so vehemently indicts in lawyers. Repeatedly, director Taylor Hackford, with his characteristic taste for histrionics, supplies large measures of sex, foul language, and especially violence, which seriously undercut the sardonic tone of the film. A heartbeat after some gruesome display, the story cuts to wise-cracking Milton. Still reeling from the last shot, the audience is supposed to laugh. Further, Hackford really seems to think that more is more, pushing nasty voyeuristic staging and camera movement. Audiences move from observation, which preserves some degree of distance, to inescapable participation, crossing an increasingly fuzzy cinematic boundary. Hackford could learn something from Hawthorne about indirection and suggestion in fathoming the darkness of souls.
Still, despite this frequent ham-fistedness, Devil’s Advocate works pretty well, and given the prevailing fluff in theaters, it is well worth a couple of hours in the dark. To be sure, clumsy cautionary tales are better than none at all.
Roy Anker is professor of English at Calvin College.
Cornelius Plantinga, Jr.
It’s hard to be full of grace when you’re full of fear.
“In church the other Sunday,” said the humorist Erma Bombeck,
I was intent on a small child who was turning around smiling at everyone. He wasn’t gurgling, spitting, humming, kicking, tearing the hymnals, or rummaging through his mother’s handbag. He was just smiling. Finally his mother jerked him around and in a stage whisper that could be heard in a little theatre off Broadway said, “Stop that grinning! You’re in church!” With that, she gave him a belt and as the tears rolled down his cheeks added, “That’s better,” and returned to her prayers.
Early in his new book on grace, Philip Yancey quotes Bombeck to illustrate a troubling anomaly, namely, that while the Christian church’s treasure is the gospel of grace, church people don’t seem very happy about it. It’s not as if they haven’t encountered grace. Church people encounter grace all the time. They get their sins forgiven by grace and their lives regenerated. They hear of grace in sermons and receive it by means of sacraments. Their preachers greet and dismiss them with fine little bursts of grace. In between, people in church sing of grace: “Amazing grace,” they sing, “how sweet the sound, that saved a wretch like me.”
What’s amazing, says Yancey, is that, with all this grace abounding, we Christian people are often pretty graceless. We entangle ourselves in fussy legalisms that almost guarantee hypocrisy. We major in relatively minor matters of law and miss the weighty demands of justice (Yancey quotes a church official who, upon his return from Germany in 1934, reported with admiration that Hitler didn’t drink or smoke and that he liked to have women dress modestly). Moreover, we are ungenerous in our judgments and sometimes downright nasty. We write appalling letters to people with whom we disagree, demonstrating a combination of resentment and self-righteousness (the elder brother syndrome) that disqualifies us both to receive God’s love and also to pass it along to others.
Isn’t this odd? If grace is the church’s business, why don’t churches get about their business? Why don’t they try to “outgrace their rivals”? Maybe one reason, says Yancey, is that evangelicals (the main group he has in mind when he speaks of the church) have gotten swept up into power politics. Their idea is not to preach the gospel but to pass a law or elect a candidate. And it’s tough to show grace when you are lobbying for a law, or when you are painting a bad enough face on a political opponent that people will reject her. Maybe another reason is that we conservative Christians are full of fear. We think the country is sliding to hell, and that somebody ought to arrest it. We think that indecency is riding high, and that somebody ought to unhorse it. It’s hard to be full of grace when you are full of fear.
The result, says Yancey, is that even though its main business is grace, the church spends an awful lot of time “stigmatizing homosexuals, shaming unwed mothers, persecuting immigrants, harassing the homeless,” and seeing to it that lawbreakers get properly punished. But what about the courage to call a sin a sin in this lawless and self-indulgent age? Yancey knows the tension between justice and grace very well and speaks of it eloquently. What do you say to a person you love who has done something very wrong? How can you forgive a person who has slain your child? How can you forgive a person that you’d like to slay? How do we handle the phenomenon that Robert Farrar Capon observes, namely, that if we show grace to someone, the recipient may then take this as permission to minimize or even to repeat his offense?
In this connection, Yancey mentions a man who intended to dump his faithful wife for a younger one and who also planned in advance to be sorry afterwards and to ask various offended parties for forgiveness. What’s the right approach to such a person? The author has a proposal, and a very thoughtful one. Indeed, thoughtfulness is a consistent quality of this book, not the least in its insistence that if forgiveness of some heinous offenses seems outrageous, the alternative—round upon round of retaliation—is even more outrageous.
Throughout, Yancey whittles the distinction between grace and ungrace to a very sharp point. He wants to know, for example, why evangelical Christians make so much of the sins of elective abortion and homosexual practice and so little of the sins of pride, mercilessness, and self-righteousness. Does this come anywhere near Jesus’ pattern?
We have made our peace with divorce—maybe too readily. We have come to terms with greed. Why are we so tough on homosexuals? Could it be that we think of real sin as something alien, something “over there,” something apart from us, something queer? Could it be that when we feel the urge to confess sin we feel first the urge to confess the sins of others?
What’s So Amazing About Grace?
by Philip Yancey
Zondervan
292 pp.; $19.99
In a truly virtuoso chapter (“No Oddballs Allowed”) we get a lesson in biblical habits of the heart where queerness is concerned. Peter has a vision in Acts 10 of a sheet descending from heaven that is full of unclean animals, reptiles, and birds. As an observant Jew, he is appalled at this dirty dream and still more at the accompanying imperative: “Get up, Peter. Kill and eat.” For a contemporary parallel, says Yancey, imagine a convention of Southern Baptists in Texas Stadium. A fully stocked bar descends onto the playing field and a big voice from heaven booms to all the teetotalers, “Drink up!”
The old biblical rule was that Gentiles, women, bastards, the blind, the lame, the crippled, dwarfed, or crazy—all these, plus people with skin diseases, people who had touched a corpse, and guys with damaged testicles—were all unkosher. What’s striking in the New Testament is that these are the sorts of people Jesus goes to. These are the people Jesus touches, heals, affirms, forgives. And when the apostles spread the gospel, it’s strikingly a gospel of grace for the unkosher—a gospel for Gentiles, women, and Ethiopian eunuchs.
Given this huge New Testament novelty, this push toward aliens, why is the church so uptight with its grace? This is Yancey’s persistent question, and he asks it a dozen ways. The reader who begins this book may wrongly suspect it of being pretty. Perhaps, the reader thinks, the author will string out a hundred inspiring stories and anecdotes. Maybe, all told, we’ll have more illustrations than the line of analysis can hold—more beads than string.
Indeed, this book does contain a wealth of terrific tales, both of grace and of ungrace. It contains anecdotes, commentary on books and films, observations of national and international events, and imaginative retellings of biblical parables. Some of these things are as lovely as a song. But all this illustrative material is disciplined by the book’s topic and purpose. Yancey wants us to know the church’s treasure, to taste and see that it is good. But he also wants us to know the church’s untreasure, to taste and see that it is graceless. To do these things he has to make judgments.
I’m delighted to say that these judgments show the same graciousness the book praises. A lot of the judgments show up in the interrogative mood. Some of them show up as the conclusion of “I wonder … ” statements. A few of them show up as the author’s personal confession. This last genre makes demands on an author’s honesty. What we often get, says Fred Craddock, is “those familiar dramas of disguise in which the [writer] boasts of weaknesses and humbly confesses strengths.” But Yancey’s reminiscences are straight and clean, and some of them are painfully revealing.
This whole book has a kind of crispness to it that stiffens the narrative against the sort of sentimentality or overripeness that otherwise threatens to creep over a treatment of grace. The author knows that the grace of God is free, but it’s never cheap. In fact, grace often comes to us at terrible cost and is held out to us in bloody hands. Indeed, grace can seem bizarre, but without it there is no gospel—nothing to preach, nothing to sing, and nothing to take to heart when we are sick of our guilt and shame.
Cornelius Plantinga, Jr., is dean of the chapel, Calvin College. He is the author of a number of books, including Not the Way It’s Supposed to Be: A Breviary of Sin (Eerdmans).
Interview by Michael Cromartie
How the “party of exposure” came to dominate modern culture.
On the crowded rack at the local superstore, not far from the cover of a men’s magazine showing an extraordinarily beautiful young woman with fetchingly unzipped jeans, and about three feet north of a colorful array of gay magazines, the headline for the cover story from the Nation (Nov. 24, 1997) caught the browser’s eye: “THE NEW PURITANISM.” (The story, by John Leonard, took off from the failure of the new movie version of Lolita to find an American distributor.) Yeah, those New Puritans are really on the warpath. Who knows where the iron hand of repression will strike next?
It is not news that we live in a show-all, tell-all culture, where one of the year’s most talked-about books, an instant best-seller, is a woman’s memoir of incest with her father, carried on into her adulthood and recounted in lascivious detail, and where jaded 14-year-olds with a library of videos and CDs have already seen and heard everything. Yet even among those of us who are repelled and disheartened by such excesses, there are many who would be loath to return to the conventions of a century ago, if such a return were possible.
How did we get here? That is the subject of Rochelle Gurstein’s important book, The Repeal of Reticence: A History of America’s Cultural and Legal Struggles over Free Speech, Obscenity, Sexual Liberation, and Modern Art (Hill & Wang), which traces the triumph of the “party of exposure” from the late nineteenth century to the 1960s. For anyone who wants to understand the peculiar logic of our culture, and especially for those who share the conviction that the assault on privacy has had a disastrous impact on the public sphere, Gurstein’s meticulously documented study is essential reading.
Michael Cromartie interviewed Gurstein in October 1997 in New York, where she teaches at Bard College’s Graduate Center.
In your book you quote Hannah Arendt: “[T]he activity of taste decides how this world is to look and sound, what men will see and what they will hear in it.” And then you conclude that “the public sphere has degenerated into a stage for sensational displays of matters people formerly would have considered unfit for public appearance.” I liked the two phrases you use to explain how this loss of taste and judgment has occurred: the “party of reticence” and the “party of exposure.” Can you define them for us?In the last quarter of the nineteenth century, new agencies of exposure suddenly appeared: invasive journalism, a new kind of fiction that prided itself on its unflinching realism, and a new kind of discussion about intimacy and sex through sex education. I call the people who championed the attitudes underlying these changes-and those who continue to champion such causes today-“the party of exposure.” Their opponents are “the party of reticence.” Today the latter are likely to be dismissed as “Victorians,” the epitome of all that is prudish and backward-looking. What I found instead was that there was a whole rich language that they had developed over many years, and that these new threats to privacy sharpened their self-awareness of beliefs that they had simply taken for granted.
Did these three engines of exposure (invasive mass journalism, the realist novel, and social reformers who promoted sex education) appear at the same time, more or less independently of one another?
Invasive journalism had predecessors before the Civil War, but it took a new kind of journalism, mass-circulation journalism, to develop the institutionalized prying into the lives of the rich and famous that began to flourish in the latter part of the nineteenth century. So, yes, these developments occurred simultaneously, and, yes, they were largely independent of one another. When I was doing my research for this book, I came across them in separate contexts. For the most part, these were three distinct discourses. What was striking to me is that most of the participants in these debates didn’t see the connections. And yet all three discourses centered on the question of what sort of things should appear in public.
So your book is really about what should be allowed in public and what should be kept private, and the tragic consequences of failing to keep these distinctions clear. And these distinctions started to break down when the party of exposure began to shine a light in all the dark places.
Yes, in the name of freedom. In their view, what was private was hidden: something was being covered up. They saw themselves as liberators.
Did the party of exposure have an idealistic view of human nature?
Yes, certainly the first generation did. They lacked a real sense of evil in the world. And this is why, for example, Agnes Repplier, who wrote the essay “The Repeal of Reticence” in 1913 from which I take the title of my book, was so appalled by the party of exposure. She was particularly scathing about the incorrigible naivete of the sex reformers. Even H. L. Mencken, whom I put on the side of the party of exposure, had only contempt for the sex reformers. He has that wonderful line about “making the unknown not worth knowing.”
The view of human nature held by the party of reticence was quite different.
Yes. In part, that was based on a religious foundation, as we can clearly see in the late twentieth-century heirs of the party of reticence. But I think there is also a secular version of it, which simply insists that there are aspects of bodily experience that need the protection of privacy and that, if exposed, dehumanize people and degrade them. The party of reticence was also strongly influenced by the nineteenth-century cult of domesticity, with its notion that the home is sacred. The outcry against the early practitioners of invasive journalism was that they were violating the sacred precincts of domesticity. Now, there was a purely conventional aspect to this reaction, but I think there was also a genuine recognition that something very important and very fragile and mysterious takes place “at home,” and that you can’t violate that sphere of privacy with impunity.
You say in your book that there is a deep structure in our consciousness of the shameful and the sacred.
This has to do with exposure and concealment. There are deep human instincts for modesty and privacy, which transcend cultural and historical differences. So, for example, among people whose conventions of dress are much different from our own, those instincts are still operative. It’s no accident that when people want to humiliate other human beings whom they have in their power, to destroy their self-respect and their resistance, they often strip them naked. In our society, the party of exposure continues to insist that this taboo or that taboo is merely a vestige of repression or intolerance or superstition. But the threshold has to keep being raised. The taboos that the late nineteenth-century reformers overthrew seem really quaint today. So whether you look at the art world, or pornography, or the news, you see in our society an escalation of the assault on reticence. One of the things that people used to say in the nineteenth century, which I think is true, is that the more you are exposed to unseemly matter, the more easily you get used to it, so that you don’t even notice it. One of the defining qualities of sophisticated modern people is that nothing shocks them, as opposed to the party of reticence, where showing shock or blushing was a sign of refinement.
You suggest that the party of exposure didn’t realize how fragile our private and intimate lives are. What do you mean by the “fragility of intimacy”?
Again, it is this question of scale, which comes from the classical notion of genres. Private life, both love and the activities of the body, consists of things that are slight in scale but important in their proper sphere. To speak about them either in a reductive scientific fashion or in a too casual manner deprives them of their importance. One of the criticisms of the sex reformers is the scientific quality of their language and the flattening that goes with that.
Before the turn of the century, people could only speak about sexual intimacy as either lust or love; it had a moral component built into it. What the sex reformers tried to do in the name of freeing people from Freudian neuroses was to split off sex from the valuation of shame or lust or love. This didn’t make sense to the party of reticence. They rejected the notion that we start with a fact of biology and then clothe it with some kind of value. Rather, the value and the fact are one.
You suggest that the struggle between exposure and reticence was essentially decided by the 1930s. Now this will probably surprise most readers, who would tend to assume that the party of reticence didn’t lose the battle until the 1960s.
One piece of evidence I put forward is a quotation from a man who had become the president of the New York Society for the Suppression of Vice. He suggested that obscenity and vulgarity can be split apart, that obscenity belongs to morality and vulgarity to taste. So he said in 1935, after the Ulysses decision:
Tastes necessarily differ, and with books that simply offend good taste and are hopelessly vulgar (and not flagrantly pornographic, obscene, or immoral), we cannot in this age, when former notions of propriety and decency have so radically changed, attempt to take restraining steps which might not meet with broader views now taken by our courts. … Times have changed and we must change with them.
In practice, this distinction between obscenity and vulgarity was fatal to the party of reticence. What happened was that all sorts of things that would have been frankly labeled as immoral by social consensus were suddenly redefined as mere matters of taste. If the party of reticence had still had some life in it, it would have contested this distinction much more vigorously. But it didn’t, and so the battle was essentially lost.
Did the triumph of the party of exposure result in a culture that its founders had no intention of creating?
It’s hard to imagine that when they were thinking about free speech and the First Amendment they had in mind violent rap groups like 2 Live Crew. I think if many of the reformers from the late nineteenth century and the early twentieth century could see the videos that many kids watch today, they would be horrified.
You cite an interesting reaction at the end of the 1960s by the lawyer Morris L. Ernst. What were his second thoughts?
Ernst was a prominent free-speech advocate who was the lawyer in the Ulysses trial in 1933-34 as well as many other famous cases. He was quoted in the New York Times in January 1970, saying that when he defended Ulysses and the right to use four-letter words, he did not have live sex onstage in mind. He was appalled by what he saw in American culture in the 1960s. This was a man who had written many books as an uncompromising libertarian, and who had always sought to portray himself as a free thinker and free liver. So for him to say toward the end of his life that he had second thoughts—that this is not what he had in mind—was really astounding.
Am I right in saying this problem of exposure is going to be hard to deal with through the law, because the law lacks a vocabulary to analyze the more amorphous subject of our common life together?
Yes, I believe that very strongly. The law has failed to control obscenity. In writing my book and thinking about these questions, I have come to the conclusion that the language of rights and interests, harms and victims, is always looking for a specific person who has been hurt. So it is very difficult then to say, What’s the public dimension? We have become captive to this new way of thinking—that you need to produce an actual person who has been harmed, a “real victim.” The same sort of argument is used with regard to environmental pollution. It’s clear to me that we are all suffering as a consequence of the triumph of the party of exposure, but unless you can say, Here’s someone who has cancer because of this thing, it’s hard to get any control on the polluters.
You talk about the “pollution of public space.” Explain what that means.
That came from reading nineteenth-century debates about obscenity. Pollution and contamination were words that they used to describe the consequences of obscenity. Often they were speaking of invasive journalism in this context, but I was very interested to find that in early obscenity trials of the 1870s, ’80s, and ’90s, the courts would refuse to put the actual obscene passages into their records. The defendants would then make the case that the charges against them were not clearly specified. But the courts time after time would respond that they could not permit the actual obscene words to be introduced into the public record because to do so would be to pollute it. This idea of pollution or contamination of public space is something we need to revive.
There is a movement among some conservative religious communities to employ economic boycotts in an effort to restrain the party of exposure. What kind of difference do you think boycotts might make in causing Hollywood, for example, to be a little more reticent?
The standard response of filmmakers and music producers and other purveyors of culture today is that they are simply giving people what they want. A really effective boycott will show them that they have misjudged the market, and they will be forced to adjust, not out of moral conviction but out of self-interest. I think that those kinds of local responses are very good.
Local more than national?
Well, national would be good, too. But I think that local is better in the sense that people are more involved. There are problems with a national boycott of Disney or Hollywood or Madison Avenue; it’s hard to boycott entities that operate on such a vast scale. And I think the people who decide to protest in this way need to make it clear on what grounds they are boycotting.
The Repeal of Reticence:
A History of America’s Cultural and Legal
Struggles over Free Speech, Obscenity,
Sexual Liberation, and Modern Art
by Rochelle Gurstein
Hill & Wang
357 pp.; $27.50
You speak of the need for the recovery of taste and judgment. What are your thoughts on censorship?
As I’ve said, our legal language as it now exists lacks the resources to talk about the real harm to the public sphere. The courts as they are currently thinking of this question can’t address what’s important. It’s very hard for me to imagine that censorship would help. Censorship frames the loss of reticence as a First Amendment question, and I would want to get away from that. Catharine MacKinnon and other radical feminists have recently tried to shift the emphasis away from the First Amendment and say that it’s a question of the Fourteenth Amendment: that obscenity and pornography reinforce a system of inequality at large against women. I prefer that approach to the First Amendment argument, but I still don’t think it’s the right move. In the end, I think it trivializes what the Fourteenth Amendment is about.
Some people argue that obscene materials are really not a matter of protected speech. But that argument is not being won today, is it?
Well, I’m heartened by the connections that are being made between hate speech and pornography. There is some good work being done where, in the face of the cult of free speech, people are making the point that there is a lot of speech that’s already prohibited: seditious speech, libel, and treason, for example. And if those restraints on speech already exist, you cannot defend pornography on the basis of an absolute freedom of speech. Why then should pornography be permitted? Why should hate speech be permitted? You’re right, these arguments are not carrying the day, but I am happy to see that there is already a group of people who aren’t accepting libertarian thought without questioning it.
What are your thoughts about the so-called Yale Five, the Orthodox Jewish students at Yale who are demanding that they not be required to live in coed dorms because the environment (with bags of condoms hanging from the doors, and so on) is clearly at odds with their religious beliefs? Is that an example of the battle between the party of reticence and the party of exposure?
Those students, I would imagine, knew that coed living was part of the deal of going to Yale. So one could say, when they made their choice to go to Yale, they knew what was in store for them. On the other hand, one can rightly ask, Why doesn’t Yale have noncoed dormitories? It doesn’t seem to be such an outrageous request, not only for people whose religious beliefs are being trampled upon and assaulted, but for people who may just be modest and who may not want to be seen by their own sex, let alone someone of the opposite sex. So I’m very surprised that Yale doesn’t even have that option.
How does society recover a cultural reticence?
I am not very hopeful that one can. We’ve all been polluted and alienated in some way, even people who want to be part of the reticent sensibility, and we make these distinctions self-consciously. I think it’s a world that’s lost, and that those of us who are sympathetic with it can take inspiration from it as an ideal. I must say in a confessional tone (which is inappropriate to an author on reticence), when I started writing this book I was not in the party of reticence. I was in the party, not exactly of exposure, but more comfortable with the idea of exposure.
And what happened?
In reading (this is the only way, I think, that reticence can be reborn: in reading) I came to hold positions that were foreign to me.
You began to see through your research the debilitating effects of the party of exposure all around us?
I think I began with the premise that our world was ugly. And that there was something wrong. But it was a revelation to me to discover this discussion about shame and sacredness. Take the case of Mapplethorpe, for example, and the debate about him. I felt very uncomfortable finding something wrong with Mapplethorpe because of the pressure to be sophisticated and modern. But I was convinced that there had to be another way. The consequence of not having a sense of shame, and believing that nothing is sacred, is a world that looks like ours. And I find that very distressing.
Michael Cromartie directs the Evangelical Studies Project at the Ethics and Public Policy Center in Washington, D.C.
by Philip Yancey
The question is not why modern secularists oppose traditional morality; it is on what grounds they defend any morality.
A representative of Generation X named Sam told me he had been discovering the strategic advantages of truth. As an experiment, he decided to stop lying. “It helps people picture you and relate to you more reliably,” he said. “Truth can be positively beneficial in many ways.” I asked what would happen if he found himself in a situation where it would prove more beneficial for him to lie. He said he would have to judge the context, but he was trying to prefer not-lying.
For Sam, the decision to lie or tell the truth involved not morality but a social construct, to be adopted or rejected as a matter of expedience. In essence, the source of moral authority for Sam is himself, and that in a nutshell is the dilemma confronting moral philosophy in the postmodern world.
Something unprecedented in human history is brewing: a rejection of external moral sources altogether. Individuals and societies have always been im-moral to varying degrees. Individuals (never an entire society) have sometimes declared themselves amoral, professing agnosticism about ethical matters. Only recently, however, have serious thinkers entertained the notion of un-morality: that there is no such thing as morality. A trend prefigured by Nietzsche, prophesied by Dostoyevsky, and analyzed presciently by C. S. Lewis in The Abolition of Man is now coming to fruition. The very concept of morality is undergoing a profound change, led in part by the advance guard of a new science called “evolutionary psychology.”
So far, however, the pioneers of unmorality have practiced a blatant contradiction. Following in the style of Jean-Paul Sartre, who declared that meaningful communication is impossible even as he devoted his life to communicating meaningfully, the new moralists first proclaim that morality is capricious, perhaps even a joke, then proceed to use moral categories to condemn their opponents. These new high priests lecture us solemnly about multiculturalism, gender equality, homophobia, and environmental degradation, all the while ignoring the fact that they have systematically destroyed any basis for judging such behavior right or wrong. The emperor so quick to discourse about fashion happens to be stark naked.
For example, George Williams wrote a landmark book in 1966 entitled Adaptation and Natural Selection, which portrayed all behavior as a genetically programmed expression of self-interest. Yet later, after examining some of the grosser examples of animal behavior, he concluded that “Mother Nature is a wicked old witch. … Natural selection really is as bad as it seems and … it should be neither run from nor emulated, but rather combatted.”
Williams neglected to explain what allowed him, a product of pure natural selection, to levitate above nature and judge it morally bankrupt. He may understandably disapprove of animal cannibalism and rape, but on what grounds can he judge them “evil”? And how can we—or why should we—combat something programmed into our genes?
Lest I sound like a cranky middle-aged moralist, I should clarify at the beginning that to me the real question is not why modern secularists oppose traditional morality; it is on what grounds they defend any morality.
We hold these truths to be probable enough for pragmatists, that all things looking like men were evolved somehow, being endowed by heredity and environment with no equal rights but very unequal wrongs … Men will more and more realize that there is no meaning in democracy if there is no meaning in anything. And there is no meaning in anything if the universe has not a center of significance and an authority that is the author of our rights.
—G. K. Chesterton
In a great irony, the “politically correct” movement defending the rights of women, minorities, and the environment often positions itself as an enemy of the Christian church when, in historical fact, the church has contributed the very underpinnings that make such a movement possible. Christianity brought an end to slavery, and its crusading fervor also fueled the early labor movement, women’s suffrage, human-rights campaigns, and civil rights. According to Robert Bellah, “there has not been a major issue in the history of the United States on which religious bodies did not speak out, publicly and vociferously.”
It was no accident that Christians pioneered in the antislavery movement, for their beliefs had a theological impetus. Both slavery and the oppression of women were based, anachronistically, on an embryonic form of Darwinism. Aristotle had observed that
Tame animals are naturally better than wild animals, yet for all tame animals there is an advantage in being under human control, as this secures their survival. And as regards the relationship between male and female, the former is naturally superior, the latter inferior, the former rules and the latter is subject. By analogy, the same must necessarily apply to mankind as a whole. Therefore all men who differ from one another by as much as the soul differs from the body or man from a wild beast (and that is the state of those who work by using their bodies, and for whom that is the best they can do)—these people are slaves by nature, and it is better for them to be subject to this kind of control, as it is better for the other creatures I have mentioned. … It is clear that there are certain people who are free and certain people who are slaves by nature, and it is both to their advantage, and just, for them to be slaves. … From the hour of their birth, some men are marked out for subjection, others for rule.
Cross out the name Aristotle and read the paragraph again as the discovery of a leading evolutionary psychologist. No one is proposing the reimposition of slavery, of course—but why not? If we learn our morality from nature, and if our only rights are those we create for ourselves, why should not the strong exercise their “natural rights” over the weak?
As Alasdair MacIntyre remarks in After Virtue, modern protesters have not abandoned moral argument, though they have abandoned any coherent platform from which to make a moral argument. They keep using moral terminology—it is wrong to own slaves, rape a woman, abuse a child, despoil the environment, discriminate against homosexuals—but they have no “higher authority” to which to appeal to make their moral judgments. MacIntyre concludes,
Hence the utterance of protest is characteristically addressed to those who already share the protestors’ premises. The effects of incommensurability ensure that protestors rarely have anyone else to talk to but themselves. This is not to say that protest cannot be effective; it is to say that it cannot be rationally effective and that its dominant modes of expression give evidence of a certain perhaps unconscious awareness of this.
In the United States, we prefer to settle major issues on utilitarian or pragmatic grounds. But philosophers including Aristotle and David Hume argued powerfully in favor of slavery on those very grounds. Hitler pursued his genocidal policies against the Jews and “defective” persons on utilitarian grounds. Unless modern thinkers can locate a source of moral authority somewhere else than in the collective sentiments of human beings, we will always be vulnerable to dangerous swings of moral consensus.
A man who has no assured and ever-present belief in the existence of a personal God or of a future existence with retribution or reward, can have for his rule of life, as far as I can see, only to follow those impulses and instincts which are the strongest or which seem to him the best ones.
—Charles Darwin
Christina Hoff Sommers tells of a Massachusetts educator attempting to teach values-clarification to her class of sixth-graders. One day her canny students announced that they valued cheating and wanted the freedom to practice it in class. Hoist with her own petard, the teacher could only respond that since it was her class, she insisted on honesty; they would have to exercise their dishonesty in other places. In view of such an approach to morality, should it surprise us to learn from surveys that half of all students cheat? What restrains the other half?
What makes a person good? What is “good” anyway? Moral philosophers such as Charles Taylor and Alasdair MacIntyre argue convincingly that many people in the modern world can no longer answer that question coherently.
A friend of mine named Susan, a committed Christian, told me that her husband did not measure up and she was actively looking for other men to meet her needs for intimacy. When Susan mentioned that she rose early each day to “spend an hour with the Father,” I asked, “In your meetings with the Father, do any moral issues come up that might influence this pending decision about leaving your husband?”
Susan bristled: “That sounds like the response of a white Anglo-Saxon male. The Father and I are into relationship, not morality. Relationship means being wholly supportive and standing alongside me, not judging.” I gently pointed out that we all make judgments in our relationships. Had she not judged her husband incapable of meeting her needs? Susan fended off my arguments, and we moved on to more congenial topics.
Like many moderns, my friend Susan has moved the locus of morality from an external to an internal source, a change that traces back to the Romantic movement and its new celebration of the individual. In his essay “Self-Reliance,” Ralph Waldo Emerson proclaimed that everyone should “Trust thyself,” for divinity resides in every person. What if your intuitions are evil? Emerson did not back down: “They do not seem to me to be such; but if I am the devil’s child, I will live then from the devil. No law can be sacred to me but that of my nature.”
Rousseau, a grandfather of Romanticism, had followed the dictates of his heart by abandoning five infants born to his illiterate servant-mistress. Of course, one could find many such scandalous incidents before the outbreak of Romanticism. The real change was more subtle and subterranean. From Aristotle onward, the West had always perceived “the good” as an external code, neither mine nor yours. Though one could choose to break the code, it remained an external code above and beyond the reach of any individual. With Romanticism, the code moved inside so as to become radically subjective. The individual self began writing his or her own moral script.
Nearly two centuries after the flowering of Romanticism, we are witnessing the consequences of that unmooring of the moral code. In a strange twist, whereas Augustine viewed evil as a perversion of good, modern ethicists view goodness as a manifestation of selfishness. Everything we do, including every act of nobility or altruism, serves a hidden purpose: to enhance oneself or to perpetuate genetic material. Challenged to explain Mother Teresa’s behavior, sociobiologist Edward O. Wilson pointed out that she was secure in the service of Christ and in her belief in immortality; in other words, believing she would get her reward, she acted on that “selfish” basis.
Robertson McQuilkin, a college president who resigned in order to care for his Alzheimer’s-afflicted wife, attended a seminar in which a researcher reported that, in her study of 47 couples facing terminal illness, she had predicted with 100 percent accuracy who would die soonest, simply by observing the relationship between husband and wife. “Love helps survival,” she concluded. From there, McQuilkin went directly to another session in which an expert listed reasons why families might choose to keep an ailing family member at home rather than in a nursing facility. Noting that the reasons all boiled down to economic necessity or guilt feelings, McQuilkin asked, “What about love?” “Oh,” replied the expert, “I put that under guilt.”
While redefining goodness, modern society has simultaneously discarded the notion of sin. In the movie Ironweed, Helen, an alcoholic, informs God at a candlelit altar, “You may call them sins; I call them decisions.” Increasingly, bad actions are seen not as sins or decisions but as the outworking of behavior patterns hard-wired into our brains. A murderer goes free on the grounds that eating Twinkies contributed to his mental instability; a national authority excuses political consultant Dick Morris’s adultery as the normal biological response to an environment of power and status.
I have already mentioned that scientists who dismantle any notion of good and evil nevertheless must fall back on those categories of judgment. This kind of moral schizophrenia expresses itself at every level of society. We must cling to some form of morality, or person and society alike will swirl apart. Yet individuals find themselves unable to articulate a code of morality, and even less able to keep any code. Abbie Hoffman, a radical leader in the 1960s, complained, “I’ve never liked guilt-tripping. I’ve always left the concept of sin to the Catholic Church. When I was four, my mother said, ‘There’s millions of people starving in China. Eat your dinner.’ I said, ‘Ma, name one.’ ” Yet this rebel against guilt trips ran a distinctively “moral” campaign against a repressive society and an unjust war.
After interviewing average Americans to determine why they behave the way they do, Robert Bellah and his associates came up with a primary ethic of “self-fulfillment.” Bellah acknowledges that most people want to be “good” even though few can articulate a reason for it. In their roles as parents, spouses, and citizens, ordinary people demonstrate qualities of sacrifice, fidelity, and altruism. They act, in Bellah’s opinion, out of “habits of the heart” rooted primarily in America’s Christian heritage. Remove those habits of the heart, and the true pathology of modern times comes to light.
Indeed, psychopaths represent the group that acts most consistently in accord with the new code of “unmorality.” Immune to social pressures, these deviants live out the courage of their nonconvictions.
“Character,” says Robert Coles, “is how you behave when no one is looking.” Coles goes on to suggest that for the conscientious, those with a highly developed moral sense, “someone is always ‘looking,’ even if we are as solitary as Thoreau at Walden.” But for the psychopath or sociopath, the “unmoral” person, no one is ever looking. The unmoral person believes in no outside source of moral authority and inside hears only the “terrible silences of an emotionally abandoned early life or the demonic voices of a tormented childhood.”
Prison interviews with two mass murderers, Jeffrey Dahmer and Ted Bundy, bear out Coles’s observation. Both were asked how they could possibly do the things they did. Both replied that, at the time, they did not believe in God and felt accountable to no one. They started with petty cruelty, then moved to torture of animals and people, and then murder. Nothing internal or external stopped them from making the descent to unmorality—they felt no twinge of guilt. Ironically, both mass murderers followed to logical conclusion the principle laid down by Charles Darwin a century ago—that without a belief in God or afterlife, a person can only follow those impulses and instincts that are the strongest.
We read daily in the newspapers the tragic results of those who follow their strongest impulses. Bill Moyers asked the late Joseph Campbell what results when a society no longer embraces a religion or powerful mythology. “What we’ve got on our hands,” Campbell replied. “… read the New York Times.”
Not only for psychopaths, but for everyday sinners, the practice of looking inside for moral guidance is fraught with danger. Woody Allen, a sophisticated, brilliant filmmaker, granted an interview to Time magazine in order to counter his partner’s accusations of sexual abuse of her children and to explain his affair with her 21-year-old adopted Korean daughter. “The heart wants what it wants,” said Allen. “There’s no logic to those things. You meet someone and you fall in love and that’s that.”
It is easy to see that the moral sense has been bred out of certain sections of the population, like the wings have been bred off certain chickens to produce more white meat on them. This is a generation of wingless chickens.
—Flannery O’Connor
What happens when an entire society becomes populated with wingless chickens? I need not dwell on the contemporary symptoms of moral illness in the United States: our rate of violent crime has quintupled in my lifetime; a third of all babies are now born out of wedlock; half of all marriages end in divorce; the richest nation on earth has a homeless population larger than the entire population of some nations. These familiar symptoms are just that, symptoms. A diagnosis would look beyond them to our loss of a teleological sense. “Can one be a saint if God does not exist? That is the only concrete problem I know of today,” wrote Albert Camus in The Plague.
Civilization holds together when a society learns to place moral values above the human appetites for power, wealth, violence, and pleasure. Historically, it has always relied on religion to provide a source for that moral authority. In fact, according to Will and Ariel Durant, “There is no significant example in history, before our time, of a society successfully maintaining moral life without the aid of religion.” They added the foreboding remark, “The greatest question of our time is not communism versus individualism, not Europe versus America, not even the East versus the West; it is whether men can live without God.”
Vaclav Havel, a survivor of a civilization that tried to live without God, sees the crisis clearly:
I believe that with the loss of God, man has lost a kind of absolute and universal system of coordinates, to which he could always relate everything, chiefly himself. His world and his personality gradually began to break up into separate, incoherent fragments corresponding to different, relative, coordinates.
On moral issues—social justice, sexuality, marriage and family, definitions of life and death—society badly needs a moral tether, or “system of coordinates” in Havel’s phrase. Otherwise, our laws and politics will begin to reflect the same kind of moral schizophrenia already seen in individuals.
On what moral basis do doctrinaire Darwinians, committed to the survival of the fittest, ask us to protect the environment, in effect lending a hand to those we make “unfit”? On what basis do abortionists denounce the gender-based abortion practiced in India, where, in some cities, 99 percent of abortions involve a female fetus? (For this reason, some Indian cities have made it illegal for doctors to reveal to parents a fetus’s gender after an ultrasound test.) Increasingly, the schizophrenia of personal morality is being projected onto society at large.
James Davison Hunter recounts watching a segment of the Phil Donahue Show featuring men who left their wives and then had affairs with those wives’ mothers. Some of the relationships failed, but some worked out fine, the men reported. A psychologist sitting on the panel concluded, “The important thing to remember is that there is no right or wrong. I hear no wrongdoing. As I listen to their stories, I hear pain.”
Hunter speculates where a society might be headed once it loses all moral consensus. “Personally I’m into ritual animal sacrifice,” says one citizen. “Oh, really,” says another. “I happen to be into man-boy relationships.” “That’s great,” responds a third, “but my preference is … ” and so on. The logical end of such thinking, Hunter suggests, can be found in the Marquis de Sade’s novel Juliette, which declares, “Nothing is forbidden by nature.”
In Sade’s novel, Juliette’s lover enhances their sexual ecstasy by raping Juliette’s daughter and throwing the girl into a fire; wielding a poker, the mother herself prevents the child’s escape. A brute accused of raping, sodomizing, and murdering more than two dozen boys, girls, men, and women defends himself by saying that all concepts of virtue and vice are arbitrary; self-interest is the paramount rule:
Justice has no real existence, it is the deity of every passion. … So let us abandon our belief in this fiction, it no more exists than does the God of whom fools believe it the image; there is no God in this world, neither is there virtue, neither is there justice; there is nothing good, useful, or necessary but our passions.
U.S. courts today take pains to decide the merits of a case apart from religion or natural law. New York State passed a law prohibiting the use of children in pornographic films and, in order to protect it from civil libertarians, specified that the law is based not on moral or religious reasons but on “mental health” grounds. In earlier times the Supreme Court appealed to the “general consent” of society’s moral values in deciding issues such as polygamy. I wonder on what possible grounds the Court might rule against polygamy today (practiced in 84 percent of all recorded cultures)—or incest, or pederasty, for that matter. All these moral taboos derive from a religious base; take away that foundation, and why should the practices be forbidden?
To ask a basic question, What sense does marriage make in a morally neutral society? A friend of mine, though gay, is nevertheless troubled by calls for gay marriages. “What’s to keep two brothers from marrying, if they declare a commitment to each other?” he asks. “They could then enjoy the tax breaks and advantages of inheritance and health plans. It seems to me something more should be at stake in an institution like marriage.” Yes, but what is at stake in marriage? The authors of Habits of the Heart found that few individuals in their survey except committed Christians could explain why they stayed married to their spouses. Marriage as a social construct is arbitrary, flexible, and open to redefinition. Marriage as a sacrament established by God is another matter entirely.
Feminist thinkers have led the way in questioning the traditional basis of sexual ethics. In The Erotic Silence of the American Wife, Dalma Heyn argues that women unnaturally bind themselves at the marriage altar, abandoning their true needs and desires. Heyn recommends extramarital affairs as the cure for what she sardonically calls “the Donna Reed syndrome.” In an essay in Time, Barbara Ehrenreich celebrated the fact that “Sex can finally, after all these centuries, be separated from the all-too-serious business of reproduction. … The only ethic that can work in an overcrowded world is one that insists that … sex—preferably among affectionate and consenting adults—belongs squarely in the realm of play.”
Ehrenreich and Heyn are detaching sex from any teleological meaning invested in it by religion. But why limit the experience to affectionate and consenting adults? If sex is a matter of play, why not sanction pederasty, as did the Greeks and Romans? Why choose the age of 18—or 16, or 14, or 12—to mark an arbitrary distinction between child abuse and indulging in play? If sex is mere play, why do we prosecute people for incest? (Indeed, the Sex Information and Education Council of the United States circulated a paper expressing skepticism regarding “moral and religious pronouncements with respect to incest,” lamenting that the taboo has hindered scientific investigation.)
The Alice-in-Wonderland world of untethered ethics has little place for traditional morality. When California adopted a sex-education program, the ACLU sent this official memorandum:
The ACLU regrets to inform you of our opposition to SB 2394 concerning sex education in public schools. It is our position that teaching that monogamous, heterosexual intercourse within marriage is a traditional American value is an unconstitutional establishment of religious doctrine in public schools. … We believe SB 2394 violates the First Amendment.
Again I stress, to me the question is not why modern secularists reject traditional morality, but on what grounds they defend any morality. Our legal system vigorously defends a woman’s right to choose abortion—but why stop there? Historically, abandonment has been the more common means of disposing of unwanted children. Romans did it, Greeks did it, and during Rousseau’s lifetime, one-third of babies in Paris were simply abandoned. Yet today, in the United States, if a mother leaves her baby in a Chicago alley, or two teens deposit their newborn in a Dempster Dumpster, they are subject to prosecution.
We feel outrage when we hear of a middle-class couple “dumping” an Alzheimer’s-afflicted parent when they no longer wish to care for him, or when kids push a five-year-old out the window of a high-rise building, or a ten-year-old is raped in a hallway, or a mother drowns her two children because they interfere with her lifestyle. Why? On what grounds do we feel outrage if we truly believe that morality is self-determined? Evidently the people who committed the crimes felt no compunction. And if morality is not, in the end, self-determined, who determines it? On what basis do we decide?
In the landmark book Faith in the Future, Jonathan Sacks, chief rabbi of the United Hebrew Congregations of the (British) Commonwealth, argues that human society was meant to be a covenant between God and humankind, a collaborative enterprise based on common values and vision. Instead, it has become “an aggregate of individuals pursuing private interest, coming together temporarily and contractually, and leaving the state to resolve their conflicts on value-neutral grounds.” In the process, “the individual loses his moorings … and becomes prone to a sense of meaninglessness and despair.” Sacks argues that only by restoring the “moral covenant” can we reverse the breakdown in the social fabric of Western civilization.
Or, as the Jewish medical educator David C. Stolinsky put it, “The reason we fear to go out after dark is not that we may be set upon by bands of evangelicals and forced to read the New Testament, but that we may be set upon by gangs of feral young people who have been taught that nothing is superior to their own needs or feelings.”
The modern world seems to lack whatever principle it is that discriminates between authority and tyranny and between liberty and license. And without such a principle it appears that one can only oscillate between the two lawless extremes.
—Simone Weil
Critics of Christianity correctly point out that the church has proved an unreliable carrier of moral values. The church has indeed made mistakes, launching Crusades, censuring scientists, burning witches, trading in slaves, supporting tyrannical regimes. Yet the church also has an inbuilt potential for self-correction because it rests on a platform of transcendent moral authority. When human beings take upon themselves the Luciferian chore of redefining morality, untethered to any transcendent source, all hell breaks loose.
In Nazi Germany, and also in the Soviet Union, China, and Cambodia, the government severed morality from its roots. Nazi propagandists dismissed biblical revelation as “Jewish swindle” and emphasized instead the general revelation they observed in the natural order of creation. Lenin ordered Russians to adopt “the Revolutionary Conscience” as opposed to their natural conscience. Our century is the first in which societies have attempted to form their moral codes without reference to religion. We have had the chance to “take the world in our own hands,” in Camus’s phrase. Modern humanity, Camus said, “launches the essential undertaking of rebellion, which is that of replacing the reign of grace by the reign of justice.” The results are in: perhaps 100 million deaths under Hitler, Stalin, Mao, and Pol Pot attributable to this grand new reign of justice.
Today, of course, apart from China, the threat posed by communism has disappeared. We in the West rest secure, even triumphant. Yet the bats are out of the cage. The spiritual sources that fed both Nazism and communism are still with us.
We look back with horror on the Nazi campaign to exterminate the mentally defective. But not long ago the newsletter of a California chapter of Mensa, the organization for people with high IQs, published an article proposing the elimination of undesirable citizens, including the retarded and the homeless. Modern China requires the abortion of defective fetuses, including those diagnosed with retardation, and kills “unauthorized” babies born to one-child families. And in some states in the United States, due largely to pressures from insurance companies, the incidence of Down syndrome births has dropped 60 percent; the rest are aborted before birth.
In his study Morality: Religious and Secular, Basil Mitchell argues that, since the eighteenth century, secular thinkers have attempted to make reason, not religion, the basis of morality. None has successfully found a way to establish an absolute value for the individual human person. Mitchell suggests that secular thinkers can establish a relative value for people, by comparing people to animals, say, or to each other; but the idea that every person has an absolute value came out of Christianity and Judaism before it and is absent from every other ancient philosophy or religion.
The Founding Fathers of the United States, apparently aware of the danger, made a valiant attempt to connect individual rights to a transcendent source. Overruling Thomas Jefferson, who had made only a vague reference to “the Laws of Nature and of Nature’s God,” they insisted instead on including the words “unalienable” and “endowed by their Creator.” They did so in order to secure such rights in a transcendent Higher Power, so that no human power could attempt to take them away. Human dignity and worth derive from God’s.
Yet if there is no Creator to endow these rights, on what basis can they be considered unalienable? Precisely that question is asked openly today. Robert Jarvik, a scientist and inventor of the artificial human heart, expresses the more modern view:
In reality, there are no basic human rights. Mankind created them. They are conventions we agree to abide by for our mutual protection under law. Are there basic animal rights? Basic plant rights? Basic rights of any kind to protect things on our planet when the sun eventually burns out, or when we block it out with radioactive clouds? Someday, humans will realize that we are a part of nature and not separate from it. We have no more basic rights than viruses, other than those that we create for ourselves through our intellect and our compassion.
Jarvik captures the dilemma: If humans are not made in the image of God, somehow distinct from animals, what gives us any more rights than other species? Some animal rights activists already ask that question, and a writer in the journal Wild Earth even mused about the logical consequences:
If you haven’t given voluntary human extinction much thought before, the idea of a world with no people may seem strange. But, if you give the idea a chance I think you might agree that the extinction of Homo sapiens would mean survival for millions, if not billions, of other Earth-dwelling species. … Phasing out the human race will solve every problem on earth, social and environmental.
When representatives from the United States meet with their counterparts from China and Singapore to hammer out an agreement on human rights, not only do they have no common ground, they have no self-coherent ground on which to stand. Our founders made human dignity an irreducible value rooted in creation, a dignity that exists prior to any “public” status as citizen. Eliminate the Creator, and everything is on the negotiating table. By destroying the link between the social and cosmic orders, we have effectively destroyed the validity of the social order.
(Next issue: Philip Yancey on evolutionary psychology.)
Philip Yancey is the author of many books, including most recently What’s So Amazing About Grace? (Zondervan).
Jean Bethke Elshtain
Raising a “challenged” child in a world that supports good, pleasant eugenics.
What to do with idiots, imbeciles, cretins? If the reader isn’t shocked by this opening query, something is seriously wrong. We have abandoned this language as we were once enjoined to abandon children—and adults—who got slotted into such categories. Every now and then one encounters a person who says Mongoloid idiot, even in polite company, but the effect is rather like bumping into a velociraptor on an evening stroll: Where did this extinct, unpleasant creature come from?
Times change. And once in a while they change for the better. The seeds of decent treatment of those we call “exceptional” or, if we are being especially correct and perhaps a bit cutesy, “challenged,” are of ancient and noble lineage.
It is awfully hard to square Christian understanding of the imago Dei—we are all God’s creatures—with a ruthless or frightened determination to remove from our midst those among us who present themselves to us in bodies and with faces that don’t fit some norm. But square it all too many did, perhaps thinking: Surely God couldn’t have intended this! Surely this is a mistake! To be reminded of frailty and vulnerability and even brokenness in this way? Too much to bear. The human propensity to turn away from difficulties, whether conceptual, ethical, bodily, or social, kicks in, and we shun or dismiss or exile.
Life As We Know It:
A Father, a Family, and an Exceptional Child
by Michael Berube
Pantheon Books
284 pp.; $24
But it doesn’t end there, for the roots of mistreatment of persons with disabilities lie not just in a turning away from one understanding at its richest (what I have called “Christian anthropology”) but in embracing an alternative that embeds within it a rationale for discrimination of an invidious sort.
Consider the high premium the Enlightenment and rationalist philosophers placed on reason as the jewel in the anthropological crown: cogito ergo sum. This isn’t Christian thinking—Christian philosophers did not privilege reason in this way—but it certainly is Western and came to dominate much of our thinking. Augustine, by way of contrast, had posed doubting as a defining criterion of our humanness, that and our very creatureliness, which came in many varieties. Augustine’s capacious anthropology quite readily incorporated under the definition “human”
the so-called Sciopods (“shadow-feet”) because in hot weather they lie on their backs on the ground and take shelter in the shade of their feet. … What am I to say of the Cynocephali, whose dog’s head and actual barking prove them to be animals rather than men? Now we are not bound to believe in the existence of all the types of men who are described. But no faithful Christian should doubt that anyone who is born anywhere as a man—that is, a rational and mortal being—derives from that one first-created human being. And this is true, however extraordinary such a creature may appear to our senses in bodily shape, in colour, or motion, or utterance, or in any natural endowment, or part, or quality.
A rational being Augustine defines as a creature capable of communicating with its fellows—and the doglike Cynocephali did that—but, first and foremost, a creature at once natal and mortal and aware of that fact: we are born of parents of the flesh and we die.
This didn’t cut much ice with Descartes, who viewed the body as extended machinery inessential to who I am. Troubles aplenty for those who are manifestly different from birth lurk here. What are we to make of those who appear among us in bodies that are not only distinctive, as each human body is distinctive, but bodies that mark them for a life that will not be fully human on a narrowly rationalist and disembodied account of the human condition?
Thus, certain strenuous rationalists withheld full human status from the “idiots, imbeciles, cretins.” This doesn’t necessarily mean that they believed we should be brutal and thuggish to such pitiful creatures. We might, from our enlightened stance, create separate institutions for those with misshapen bodies and clumsy tongues and impenetrable minds—better for them and for us was the thinking.
Then there were those, most hideously embodied in the bio-politics of the National Socialist state, who believed the separation from the body politic of imperfect bodies should take the form of a radical excision: We’ll warehouse them and then kill them in the interest of good race policy and in tune with the laws of nature whereby the strong crowd out and even devour the weak.
Michael and Janet Berube, both academics, were already parents of a “normal” child. And then Jamie was born, a little boy very different from his brother, Nick. Jamie got categorized from the very first moment. He emerged from the womb (the Berubes had rejected prenatal testing) as a “bit Downsie,” in the physician’s terms: a child with Down syndrome, a retarded child, even, hideously, a Mongoloid child.
Another Season:
A Coach’s Story of Raising an Exceptional Son
by Gene Stallings and Sally Cook
Little, Brown
216 pp.; $22.95
How to fit the reality of Jamie, this concrete, particular little human being, into such big and, in several instances, dismissive categories? It didn’t work: Jamie didn’t fit, just as no single child ever really fits a global category. Berube, a literary theorist, gets very literarily theoretical very fast as he tries to “work” over the problem of Jamie. How has Jamie, or those in whose category he rests, been treated in literature? How are we, his parents, to represent him if we resist prior representations?
As with any account indebted to postmodern elaborations of the trouble with representation, Berube frets (somewhat obsessively, to my mind) about whether he can in any way “represent” Jamie. Well, yes, such an obligation exists. But doesn’t this open the floodgates to the sorts of indignities long suffered by those who could not represent themselves, or at least not in the ways competent, “normal” human beings can, have, and do?
As Berube goes on to offer an often riveting account of Jamie’s birth and the family’s coming-to-grips and Jamie’s subsequent coming-into-focus as a complex, delightful, singular child, he cannot resist dropping in the occasional political ad hominem. There are stock villains, mostly wicked Republican budget slashers. This doesn’t seem worth concentrating on, however, as such comments are political asides and not the heart of the matter. Whether one takes up the cudgels with Berube or believes things are rather more complicated than he allows, politically speaking (and this is where I would place myself), he is certainly right that the environment for parents of young children in America today falls far short of a decent norm in far too many cases.
There has been a thinning out of the social ecology that once helped to sustain many if not all parents in their tasks, and, as a result, there are far fewer helping hands: overstressed parents, both often working full-time; complex and inadequate managed-care situations; a loss of confidence in schools and in the helping professions more generally. Here the Berubes are lucky. They were enhanced in their parental vocation by good friends, and they found a number of canny, competent, caring professionals to tend to Jamie’s health and his development.
Perhaps most important, they found within themselves capacities they didn’t know existed. In common with most Down syndrome kids, Jamie had lots of health troubles beyond the anticipatable developmental delays. Michael and Janet Berube coped; no, they went much beyond coping. They became expert at a whole new craft. In his words: “If you had told me in August 1991—or, for that matter, after an amniocentesis in April 1991—that I’d have to feed my infant by dipping a small plastic tube in K-Y jelly and slipping it into his nose and down his pharynx into his teeny tummy, I’d have told you that I wasn’t capable of caring for such a child. [In other words, had they had amniocentesis, they would likely have opted for abortion.] But by mid-October, I felt as if I had grown new limbs and new areas of the brain to direct them.” He learned that “[y]ou can do this. You can cope with practically everything.” Many parents of children with disabilities make similar discoveries.
After she had read one of my “Hard Questions” columns for The New Republic in which I criticized our flight from finitude and our quest for bodily perfection and had gone on to muse over what this would mean to the developmentally “different,” the mother of a Down syndrome child who died, tragically, of a critical illness in his third year wrote me that she and her husband are enormously grateful to have had “the joyous privilege of parenting a child with Down syndrome. … Tommy’s [not his real name] birth truly transformed our lives in ways that we will cherish forever. But how could we have known in advance that we indeed possessed the fortitude to parent a child with special needs? And who would have told us of the rich rewards?” She continues:
The function of prenatal tests, despite protestations to the contrary, is to provide parents the information necessary to assure that all pregnancies brought to term are “normal.” I worry not only about the encouragement given to eliminating a “whole category of persons” (the point you make), but also about the prospects for respect and treatment of children who come to be brain-damaged either through unexpected birth traumas or later accidents. And what about the pressures to which parents like myself will be subject? (How could you “choose” to burden society in this way?)
She’s right, exactly right, and that leads me to a major criticism of Berube’s book: He blinks when it comes to the questions put to me so eloquently by a mother grieving the loss of her wonderful child with Down syndrome.
Berube’s text is in so many ways such an engaging account of Jamie, and the Berubes are such obviously wonderful parents, that it may seem churlish to criticize. But Berube wanders over into philosophical and ethical turf, and it is here that he must be taken to task. Berube does put the question: “Would we have chosen to have the child if we had known?” But the way he puts this question dictates a conclusion: choice is the trump card in Berube’s civic and ethical lexicon. Would we have chosen? The choice—the right—the power over life and death is ours. And he isn’t so sure—though he and his wife knew they were taking a chance by not doing amniocentesis—whether they would have followed through on the pregnancy; more likely, they would not. He hastens to assure us that he and his wife “are as strongly pro-choice today as we were before James was born.” This means, in his words, that the state does not have “the right to override an individual woman’s jurisdiction over what happens in her body,” a formulation that he knows skews things one direction but that he finds less “toxic and coercive” than the question “whether a woman should have the right to kill an innocent unborn child.” For Berube, this latter is a cruel and unacceptable way to frame the problem; so whatever the troubles with woman’s sovereign jurisdiction, it is “infinitely” preferable.
Then follows the predictable brief against the stereotypical pro-lifer: a harsh moralist concerned only with “protecting the unborn,” all too eager to overlook Jamie and others. So fetuses have rights, but children with Down syndrome can rot, more or less, according to this mythical pro-lifer. Indeed, Berube’s cardboard cutout pro-life politician denies rights to living persons.
One wonders who does this. Who are these people? He calls the implications of holding that humans have a right to life “only until they’re born” staggering, and this would be true if anybody held to that view. But I can’t think of a single pro-lifer who does, certainly not to judge from the literature I received from a number of pro-life groups.
Berube cannot be thinking of the Catholic bishops, who were second to none in opposing the big welfare reform bill because they believed it would harm children, first and foremost, and who favor national health care and most everything else Berube seems to favor. But all pro-lifers get represented—and remember, representation is his big schtick—as Scrooges who believe anyone who isn’t fully self-sufficient after birth is a deadbeat.
These are pretty cheap shots, and that’s a shame because it means Berube can skirt the questions put to me by the mother who wrote the letter I quote from above. He does this by claiming that it is “fiscal austerity” combined with eugenics that presents the real danger, not woman’s “freedom.”
Yet he notes that 90 percent of couples who learn from amniocentesis that the child the woman is carrying will be born with Down syndrome “choose” to abort. This choice does not take place in a vacuum: there are broader cultural forces that dictate that imperfection is bad; that people cannot cope; that those who will be “burdens” to us would be better off left unborn or dispatched through Kevorkian-like methods at the end of life; and on and on. Berube asks: “Why should our taxes go to support the infirm, the unable, the defective?” And he dreads a society that frames the question this way. But a society that already sets a framework for our “reproductive choices” in a manner that puts pressure to exercise that choice in one direction only is developing its own “soft” answer to Berube’s question. The tacit presupposition is: If individuals just choose the right way (and we know what that is), there will be fewer such folks to burden us.
To be fair, Berube does struggle with certain questions. And although he professes a rather astonishing agnosticism on when a fetus becomes “sufficiently babylike as to make abortion wrong,” he is opposed to third-trimester abortions: presumably by that point the fetus has met the “babylike” threshold. (I say “astonishing” because “babylikeness” kicks in much earlier if we are going with the presentation of a visual image as the definitive criterion. The fetus, of course, is human all along—what else can it be?)
Well, these are troubling matters, and Berube is troubled but not, I want to suggest, as troubled as he might be given his concern with cultural representations. Consider the representation of the Self as Sovereign Chooser and what that does to the moral universe and to those who are not such sovereign choosers, whether because they are in their nonage, or because they are “different” and cannot be such by definition, or because they are unborn. Berube recognizes that language shapes our “thoughts in material, indelible ways.” Indeed it does, and that is why the language he deploys to characterize the pros and cons of abortion needs more critical unpacking and dissection than he here gives it.
I would say to Berube: What happens if you probe your own moral squeamishness about third-trimester abortions? Where does that lead you? What do you learn if you keep going rather than throwing in the towel? When Berube queries, toward the end of this book during which Jamie comes alive to us: “So, dear reader, be you a chimney sweep or a chairman of the board, do you have any obligations to the Jamies in your midst? Why is it possible for us to believe that we may, and so easy for us to act as if we do not? Is it simply that we find it so easy to believe we will never face the prospect of caring for someone—child, parent, friend, countryman—with a disability that requires our help?”, my answer is, Yes, I do have such obligations. And yes, we do believe we will never be thus confronted. And, for that very reason, any set of societal norms, or framing of choices, or indeed absolutizing of choice and control in order to restrict the entry into our world of Jamies whose births are so easily preventable—the 90 percent of prospective parents who abort rather than add another Jamie to our midst—is subtly but inexorably blowing out the moral lights among us, as Lincoln said of Douglas’s defense of popular sovereignty in the matter of slavery.
Of Coach Stallings’s “as told to” book, it might be said that it is so refreshingly straightforward and so blunt about the shock of a “handicapped child” and the putting of one’s shoulder to the wheel to tend to his care and then finding that one is wearing one’s heart on one’s sleeve as one grows to love that child—John, in this case—beyond measure, that it restores a certain confidence in human capacities and human decency.
Stallings, the former University of Alabama football coach, describes unstintingly social situations when his son’s “retardation” was so much of an embarrassment to others that they couldn’t even bear to “notice” or to “mention” it. He and his wife were flooded with good advice, from physician to family to friend: Put him in an institution. That, says Stallings, was never an option. It is to the great credit of thousands of American parents, sometimes with the love and support of family, friends, and church, sometimes without, that they came more and more to say: That’s no option.
In a time and a place when “genetic impairment” is now routinized as grounds for abortion, when prenatal tests can be done for some 200 genetic disorders, and when parents are being frightened out of their wits as all sorts of “tests” come up with possibly dangerous information with the full panoply of science to back them up, we may be witnessing yet another move to restrict the human community. For what are we doing if not resurrecting—test by test, and all in the name of advancement and choice and sovereign freedom and science—the conviction that all sorts of categories of persons should be phased out? And we can pat ourselves on the back as enlightened, decent people the whole while.
Jean Bethke Elshtain is Laura Spelman Rockefeller Professor of Social and Political Ethics at the University of Chicago. Author of many works, most recently a collection of essays, Real Politics: At the Center of Everyday Life (Johns Hopkins University Press), she is the parent of Sheri, an adult child with mental retardation.
Copyright © 1997 by the author or Christianity Today/Books & Culture Magazine. For reprint information call 630-260-6200 or e-mail bceditor@BooksAndCulture.com.
Allen C. Guelzo
If consciousness is only an illusion, it’s the greatest mistake human beings have ever made.
“I have written this book,” Gerald Edelman brazenly announces at the opening of Bright Air, Brilliant Fire: On the Matter of the Mind, “because I think its subject is the most important one imaginable.” Since his book is about the nature of human consciousness, that might be nothing more than cutely obvious. But Edelman is not playing obvious, and he is far from alone in believing that something has recently cracked and given in what used to be the wall of mystery surrounding consciousness. Building on a generation’s worth of studies of brain physiology and on the creation, in the last decade and a half, of computers sophisticated enough to simulate thinking, Edelman—together with Patricia and Paul Churchland, Daniel Dennett, John Searle, and Francis Crick, to name only the most well-known—has suddenly thrust onto center stage an unsettling series of solutions to the mystery of human self-awareness, our subjective experience of being alive and personal, of the divine spark, if you will.
These solutions are far from unanimous in their details, but they are all agreed on one very basic point: What we call “consciousness” is purely a material process. Consciousness is not the evidence of a “mind” substance as apart from “body” substance; still less is “consciousness” the activity of a spirit or soul inside our physical bodies. “We are at the beginning of the neuroscientific revolution,” Edelman buoyantly declares, “a prelude to the largest possible scientific revolution, one with inevitable and important social consequences.” Indeed we are, and while Christians are mostly consumed with opening yet newer rounds in their century-and-a-half-old war with Charles Darwin, they have scarcely the faintest idea that the new consciousness enthusiasm is by far the greater threat to the integrity of Christian belief.
What is peculiar about what Edelman calls “the neuroscientific revolution” is that it is really not a new business at all, but merely a long-deferred one. Three hundred years ago, the achievements of Galileo, Newton, and the Scientific Revolution reduced all explanations of the physical universe to the operation of laws on material substances. They might have tried to reduce the inner world of human experience to the same level if the brain had been as easily observable as the orbit of the moon. But that, as René Descartes delighted in showing in his Meditations on First Philosophy in 1641, was not the case, a difficulty that allowed Descartes to cut one of the greatest deals in Western philosophy. In exchange for conceding that the world outside the human consciousness was nothing but material substance (and therefore the proper domain of the scientists), Descartes insisted on keeping the subjective world of the consciousness as the location of spiritual substance, or the soul. It was, in effect, the first great land-for-peace swap: the scientists would be allowed to reduce everything outside the mind to simple physical laws and material substance provided they acknowledged that personal consciousness was the product of an entirely different kind of spiritual substance that obeyed spiritual and moral laws and provided direct contact with God.
And this was not, on the whole, a bad bargain, either. The scientists had more than enough to explore in the outer world to keep them occupied for a couple hundred years, and the theologians could be content that, whatever might be true in the physical world, the irreducibility of the mind to material substance was proof of the existence of the soul, and beyond that, of God.
This zoning-off of the mind from the scientists was helped by the fact that human consciousness really did turn out to be a difficult subject to get under scientific observation. Even defining consciousness is not easy since our own consciousness is the most obvious, direct, and familiar thing we deal with every day, but also the hardest to analyze and report upon. To study one’s own consciousness is like trying to be conscious of one’s consciousness: how can you step back and look at the very thing which permits you to step back and look in the first place? Not only is it difficult to be objective about one’s own consciousness, it is impossible to simulate someone else’s. Objectively, we can all recognize the sharpness of a thorn, but only the person who is pricked by it feels pain.
There were a persistent few who kept picking at the problem, as Israel Rosenfield shows in The Strange, Familiar, and Forgotten: An Anatomy of Consciousness, but almost all of them came at it as critics of Descartes, eager to reduce consciousness to a physiological shadow of the brain and get rid of the last toehold of spiritual substance. Julien de La Mettrie in 1747 asserted that thought and consciousness were no evidence of spiritual substance but were only properties or functions of brain matter. The pioneer German neurologist Franz Gall linked certain kinds of thought to specific physical areas of the brain, and in 1861 Paul Broca staged a dramatic public demonstration of how damage to a particular area of the brain’s left hemisphere (now known as Broca’s Area) rendered certain kinds of speech impossible.
But neither Broca nor Gall attracted much interest outside of their specialties. Popular attention was riveted instead on Sigmund Freud’s pursuits of the mind’s pathologies, which led him and most of this century’s students of the mind away from the study of consciousness and into the more dubious realms of the unconscious.
In the United States, the popular dominance of pragmatism in American philosophy also diverted interest away from study of the mind and into ways of understanding and manipulating behavior. Neither the Freudians nor the behaviorists were particularly friendly to any notions of a soul, but at least none of them spent much time trying to prove that it didn’t exist.
This began to change after World War II, and one can almost pinpoint the moment when consciousness once again became a direct scientific target: the conceptualization by Alan Turing of the basic model of the computer and John von Neumann’s conclusion that the computations performed by complex, integrated computers are like the functions of the brain. Hence, the brain should be understood, not as the residence of the soul, but as the hardware of a computational device. The proof of this, which became known as the Turing test, was maniacally simple: Any logical function, mathematical or otherwise, can be performed on a Turing machine; complex logical functions merely require the development of more complex Turing machines to copy them artificially; eventually, a universal Turing machine will be able to perform all the logical functions of a human being, and in such a way that an observer will not be able to distinguish between the work done by the human being and the work done by the computer. At that point, the computer will have achieved the same mind state as the human being; or, to put it another way, we will discover that human consciousness is nothing different from the high-level operations of a Turing machine.
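The reduction the Turing test trades on is easiest to see in miniature. What follows is a minimal sketch of my own, not drawn from Turing or von Neumann, and the machine and rule names are invented for illustration: a Turing machine is nothing but a finite rulebook mapping (state, symbol) pairs to (write, move, next-state) instructions, applied step by step to a tape.

```python
# A minimal sketch of a Turing machine: a finite rulebook of
# (state, symbol) -> (write, move, next_state) entries applied to a tape.
def run_turing_machine(tape, rules, state="start", blank="_"):
    """Apply the rulebook until the machine halts; return the final tape."""
    cells = list(tape)
    head = 0
    while state != "halt":
        symbol = cells[head] if head < len(cells) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(cells):
            cells[head] = write
        else:
            cells.append(write)
        head += 1 if move == "R" else -1
    return "".join(cells)

# A toy machine that inverts a binary string: scan right, flip 0 <-> 1,
# and halt at the first blank cell.
INVERT = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011", INVERT))  # prints "0100_"
```

The computationalist wager is that consciousness differs from this toy only in the size of the rulebook.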
This opened a direct route toward creating computers so sophisticated that they could beat grandmasters at chess. What was less noticeable at first was that this also opened the direct route to overthrowing Descartes’ dualism and demonstrating that consciousness, instead of being the proof of spiritual substance in human beings, is only the by-product of computation—at best, the software of a mental Turing machine.
Although proposals for equating brains and computers first surfaced in the 1940s, the real starting date for what Edelman calls the “neuroscientific revolution” is 1986 and the publication of Patricia Churchland’s Neurophilosophy: Toward a Unified Science of the Mind/Brain. Churchland’s book promised to “change profoundly” not only our understanding of the working of the brain but “therewith our epistemology.” Even though subsequent work on the brain pressed Churchland to concede (in The Computational Brain, which she wrote with Terry Sejnowski in 1992) that the brain was a good deal more complicated than an ordinary computer, it remained her basic contention that consciousness was really only computation and that “psychological processes are in fact processes of the physical brain, not, as Descartes concluded, processes of a nonphysical soul or mind.”
That kind of blunt combativeness earned Churchland an immediate sit-up audience. But not even Churchland could match the free-wheeling feistiness of Daniel Dennett of MIT, where the nation’s most advanced center for studies of artificial computer-modeled intelligence was headquartered in the early 1980s. Dennett might be better known for the pristine Darwinian fundamentalism he defended in his best-selling Darwin’s Dangerous Idea: Evolution and the Meanings of Life (1995), but he has actually devoted most of his career as a philosopher to problems of the brain and knowledge. “My first year in college, I read Descartes’ Meditations and was hooked on the mind-body problem,” Dennett wrote in Consciousness Explained in 1991. “How on earth could my thoughts and feelings fit in the same world with the nerve cells and molecules that made up my brain?” The answer for Dennett was found in computation: computers are brains, virtually, and the process of computation is a computer’s version of consciousness. At the same time, what we experience as consciousness is a virtual equivalent of the brain’s performance as a computer. “Anyone or anything that has such a virtual machine as its control system is conscious in the fullest sense, and is conscious because it has such a virtual machine.”
What Pat Churchland likes to state as an unadorned assertion, Dennett clothes in rhetorical provocation, and there is no philosopher writing in America today who has Dennett’s gift for the jewel-like explanation or the wickedly well-timed argument. He jeers at the notion that consciousness is a resident or a location in the brain—that it is a “Cartesian Theater” where information from the material world is assembled and evaluated for thought and action. Instead, the brain operates something like a word processor, creating what Dennett calls “multiple drafts” in which good ideas or “good tricks” survive to produce design enhancements in the larger organism. Draft for draft, “conscious human minds are more-or-less serial virtual machines.”
In that respect, not only is consciousness not a Cartesian Theater, but there is no subjectivity, no intentionality, and in fact nothing that we usually call consciousness in the brain at all. The only thing that can be called consciousness is the brain’s program, and it “can best be understood as the operation of a ‘von Neumannesque’ virtual machine.” In effect, Dennett’s solution to any dualism of consciousness and brain is to eliminate consciousness, or at least eliminate it as anything more genuinely subjective than a computer program. This, as Dennett well knew, would draw immediate fire from critics who believed that Dennett was bluffing, and that he would back down the moment it was pointed out that this would reduce human beings to the equivalent of zombies. He never blinked. “We’re all zombies,” he announced—highly complex zombies, of course, but nearer kin to zombies than to Descartes’ dualist composite of material and spiritual. What we imagine to be the activity of a unique nonmaterial substance inside us (or inside our heads) that produces our consciousness is really a mistake, a sort of primitive folk-psychology belief like the flat Earth.
The problem is that, if consciousness really is an illusion, it is the greatest mistake human beings have ever made—a mistake so colossal that many of the neuroscientists and neurophilosophers in the consciousness camp who otherwise share Churchland’s and Dennett’s eagerness to reduce mind to matter are openly reluctant to make the computer the matter it gets reduced to. John Searle, who yields nothing to Dennett in combativeness, dismisses computational models of consciousness as ridiculous.
Suppose (asks Searle) I find myself in a locked room, into which questions in the form of Chinese characters are being fed; I am equipped with a handbook which tells me that, whenever a certain character is fed into the room, I am to feed a certain other one out as the answer to the question. Much as this might offer a fair imitation of conversational Chinese, I never actually understand Chinese. All I do is perform the necessary algorithm for processing Chinese characters. This “Chinese Room,” Searle insists, is a fairly good model for how a Turing machine operates, but it also underscores what a computer can never do: it can never understand Chinese. The computationalists have mistaken syntax (an algorithmic process computers can perform very well) for semantics, which only conscious humans can experience.
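Searle’s point can be made painfully concrete. The sketch below is my own illustration, not Searle’s: the “handbook” is a bare lookup table, and the romanized question-answer pairs are invented placeholders rather than real conversational Chinese. The program produces fluent-looking replies by symbol-matching alone.

```python
# A toy "Chinese Room": the handbook is a lookup table pairing incoming
# symbol strings with outgoing ones. The entries are invented placeholders.
HANDBOOK = {
    "ni hao ma": "wo hen hao",         # hypothetical "How are you?" -> "I am fine"
    "ni shi shei": "wo shi fangjian",  # hypothetical "Who are you?" -> "I am a room"
}

def chinese_room(question: str) -> str:
    """Match the incoming symbols and emit the scripted reply.
    Pure syntax: no meaning is consulted anywhere."""
    return HANDBOOK.get(question, "qing zai shuo yi bian")  # "please say it again"

print(chinese_room("ni hao ma"))  # a fluent-looking reply, minus understanding
```

However large the handbook grows, the operation remains retrieval, not comprehension, which is all Searle needs for his distinction between syntax and semantics.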
Searle’s “Chinese Room” was never intended as a defense of Descartes or of the soul. In the broadest sense, Searle actually agrees with Dennett and Churchland that “mental phenomena are caused by neurophysiological processes in the brain and are themselves features of the brain.” It’s simply that, for Searle, consciousness does not function like computation. Consciousness involves properties—such as intentionality, time, distinguishing the self from what is nonself—which have no corresponding features in a computer.
But putting the argument this way still leaves Searle open to the charge that he is really a closet dualist after all. Even if our consciousness is not the representative of a substance different from our bodies, it still involves properties that distinguish it from every other material substance in our bodies, and so we still get a wall of separation between mind and body, consciousness and brain. Searle protests that this is not at all what he wants, that consciousness should be regarded as a by-product of brain activity just like “growth, digestion, or the secretion of bile.” No use: for Searle’s critics (and over a hundred attacks on the “Chinese Room” argument have appeared since Searle first published it), any division of consciousness from the brain is dualism, and dualism is the enemy of all properly credentialed science.
In his less antagonistic moments, Searle is willing to see the computationalists as an amiable but overenthusiastic spinoff of interest in artificial intelligence. But as a materialist himself, he resents the tendency of Churchland and Dennett to cast all noncomputationalists as secret Cartesians unless they are willing to deny the existence of consciousness as a unique state and embrace the computer. “Earlier materialists argued that there aren’t any such things as separate mental phenomena, because mental phenomena are identical with brain states,” Searle complained in The Rediscovery of the Mind in 1992, but what Dennett and Churchland want to argue is “that there aren’t any such things as separate mental phenomena” in the first place. Computationalism, which he describes as “eliminative materialism” or “strong artificial intelligence” (or simply “Strong AI”), is simply “the view that mental states don’t exist at all.”
The same skepticism about computationalism flavors three of the most important observers of the consciousness revival besides Searle—Francis Crick, Gerald Edelman, and Roger Penrose. Crick came to consciousness studies through his interest in vision, and his name recognition as the codiscoverer of the DNA double helix may be the greatest publicity asset that consciousness studies have. Or maybe not: His 1994 book on consciousness, with the sensational title The Astonishing Hypothesis: The Scientific Search for the Soul, belied the pretensions of its title on the very first page when Crick explained that “I do not suggest a crisp solution to the problem.” Indeed he does not. His “astonishing hypothesis” is simply that “our identities are nothing more than an assembly of nerve cells,” a hypothesis that ceased to be astonishing quite some time ago. What is genuinely astonishing in Crick’s book is Crick’s own naïveté. He is certain (as Searle cautiously is not) that consciousness arises in the thalamus at moments when the firing of neurons achieves a certain speed and rhythm, and that mental phenomena like free will can be neatly “located in or near the anterior cingulate sulcus.”
But if Crick is amateurishly simplistic in his eagerness, he still shares with Searle two basic propositions: that there is no soul or spiritual substance that underlies or constitutes consciousness (“the idea that man has a disembodied soul is as unnecessary as the old idea that there was a Life Force”) and that consciousness is a product of biological materialism, not computational materialism. (“A brain does not look even a little bit like a general-purpose computer,” Crick snorts. “Faced with tasks that ordinary humans can do in a rapid and effortless way, such as seeing objects and understanding their significance, even the most modern computers fail.”)
Gerald Edelman’s Bright Air, Brilliant Fire and The Remembered Present: A Biological Theory of Consciousness have none of Crick’s credulity while still underscoring the same point about the biological sources of consciousness. Edelman shies away from Crick’s cheerful confidence that he can pin consciousness down to certain select neurons, preferring to speak of consciousness as a dynamic phenomenon, a relationship between “neuronal maps” in the brain or a conversation between differing orders of neurons. Edelman is also just as strongly convinced of the folly of computationalism. The brain “is not a computer and the world is not a piece of computer tape,” and computationalism is a piece of “silliness” that has arisen “from the analogy between thinking and logic.” Edelman has, moreover, a mathematician’s argument to throw back at computationalism, Kurt Gödel’s “incompleteness” theorem, which establishes that no system of algorithms is sufficient of itself to prove its own truth. If minds were computers and consciousness were computation, then by Gödel’s theorem the system could never be aware of itself—but self-awareness is the pith of consciousness if ever anything was. Hence, minds cannot be like computers.
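For reference, the first “incompleteness” theorem being invoked can be stated compactly; the rendering below is in standard notation, mine rather than Edelman’s:

```latex
% Gödel's first incompleteness theorem (informal statement):
% for any consistent, effectively axiomatized formal system F rich
% enough to express elementary arithmetic, there is a sentence G_F
% such that F neither proves nor refutes it:
\[
  F \nvdash G_F \qquad \text{and} \qquad F \nvdash \neg G_F .
\]
```

The anti-computationalist reading, pressed hardest by Penrose, is that a mind able to see that such a sentence is true is doing something no fixed rulebook F captures.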
Edelman’s portrait of consciousness as a kind of discourse seems to have a particularly strong appeal for philosophers and historians, for whom discourse is their daily bread. Israel Rosenfield, trained as a physician but a professor of history at the City University of New York, insists that consciousness “has to be relational” and that it “is this relation that creates a sense of self.” Rosenfield is critical, not only of the computationalists, but also of the quick-trigger materialism of many neurologists and psychologists going back to Gall and Broca who believe that damage to specific material locations in the brain has to result in specific alterations in consciousness. Understanding consciousness as a relation means, for Rosenfield, that it is not possible to sustain brain damage in one place in the brain “without profound alterations in the entire structure of an individual’s knowledge.” Fred Dretske, chair of the philosophy department at Stanford, also makes a case (in Naturalizing the Mind) for consciousness as a relationship, not so much between physical points in the brain as between the brain’s representations, which he insists are just as material and “objectively determinable as are the biological functions of bodily organs.”
But biological explanations of consciousness—even when redefined as neuronal or representational relations—still call for the denial of a great deal of what intuition (or “common sense” or whatever) tells us about the uniqueness and subjectivity of consciousness. Such denial does not come readily to Roger Penrose, who is a mathematician rather than a biologist like Crick or Edelman, and who finds that Gödel’s theorem does more than merely discomfit the computationalists. In Shadows of the Mind: A Search for the Missing Science of Consciousness, Penrose admonishes his consciousness colleagues that Gödel’s theorem demonstrates that “human insight lies” not only “beyond computable procedures” but “beyond formal argument” as well. Biology, as much as computers, will fail to describe consciousness because consciousness, in all likelihood, operates in ways that transcend both.
Penrose believes that, if an answer to consciousness is likely to come from any source, it will only come in the future and from the development of quantum mechanics. “The physics of ordinary matter seems, at first sight at least, to allow no room for … non-computable behaviour,” but Penrose is confident that “it is only the arrogance of our present age that leads so many to believe that we now know all the basic principles that can underlie all the subtleties of biological action.” Consciousness is too complex to explain in any terms less than a new physics.
But whether it is biology or physics that does the explaining, all of the noncomputationalists still end up on at least this much common ground with Dennett and Churchland: consciousness is a natural process, a function of material substance; there is no soul, nor any other spiritual substance; and at death it all disappears. Searle, Crick, Edelman, and Penrose rescue us from computational zombiedom, but to what end? “With the death of each individual, that particular memory and consciousness is lost,” Gerald Edelman writes, with an evident tinge of wistfulness. “There is, as such, no individual immortality.”
One does not have to be a Christian to find ominous regions of fault in the new consciousness studies. A truculent band of dissenters within the ranks of the consciousness mavens, headed by Colin McGinn and nicknamed “the New Mysterians,” insists that the experience of consciousness is so subjective that there is no secure way of drawing connections between brain physiology and conscious states. McGinn, invoking a classic argument by Thomas Nagel on the impossibility of humans conceptualizing the cognitive experience of a bat, warns that consciousness studies cannot penetrate the nature of consciousness itself without a “radical conceptual innovation (which I have argued is probably beyond us).”
David Chalmers, writing in Scientific American in 1995, suggests that maybe consciousness just happens, and cannot be reduced to either physics or biology. Maybe, even Roger Penrose allows, consciousness “might indeed require some kind of act of God—and … cannot be explained in terms of that science which has become so successful in the description of the inanimate world.”
We have learned too much in the last half-century about imperialism to imagine that the imperialism of science will have any better results, and the voices that suggest at least looking before leaping over consciousness have a good deal of merit to them. That this caution is so quickly disregarded in the consciousness books rouses suspicion that there may be other agendas driving the popularity of consciousness studies.
It is curious that although the consciousness literature has the aura of “neuroscience” about it, a number of the consciousness books form a suggestive—and I don’t think altogether accidental—counterpart to the rage for literary theory in the 1990s. If the center of literary theory has been the uncertainty of narrative selves and the production of “selves” purely as performances or interpretations, then nothing comes closer to that in physiological terms than a consciousness that has no substantial integrity of its own. Perhaps it is no accident again that Descartes is the philosopher that literary theory loves to hate.
More troubling are the implications for ethics and action posed by the consciousness studies. Gerald Edelman, with what sounds like real anguish, warns that “under present machine models of the mind,” questions of ethics and morals become “a problem of major proportions, for under such models it is easy to reject a human being or to exploit a person as simply another machine.” If the last refuge of the soul turns out not to be a refuge at all—if even our consciousness is itself no more privileged in substance than our digestion—then no good reason exists apart from cultural opinion not to indulge whatever things power gives us the means to do to each other.
This turns out to be exactly what some of the consciousness studies recommend. Almost as if in a demonic echo of Edelman’s dilemma, Daniel Dennett cheerfully catalogues as “myths” such concerns as the “sanctity of life, or of consciousness.” Francis Crick, whose indifference to the clumsiness of an argument suffers nothing when transferred from neurology to ethics, seems genuinely surprised in The Astonishing Hypothesis that anyone should raise ethical questions about a desire to experiment on human brains (and why not, when consciousness is only a matter of rioting neurons?) and he quietly applauds one colleague who “wisely did not embark on his experiments on consciousness in alert people until he had obtained the security of academic tenure.”
Only a little less ominous is the sharply deterministic bent of virtually all of the consciousness literature. Both Crick and Dennett feel there is little point in talking about free will or moral responsibility as it has been usually understood; Penrose would like to defer “the profound issue (or the ‘illusion’?) of our free wills” for consideration in “the future”; Searle and Edelman are willing to espouse some form of free will, but even then, it is only to “some degree.” Clothing consciousness studies with ethical restraint, as Edelman puts it with substantial understatement, “is one of the largest challenges of our time.”
If so, it is remarkable for how little Christian thinkers have risen to meet it. In a review in these pages (Books & Culture, January/February 1996) of books by Roger Penrose and by Paul Churchland, Patricia Churchland’s husband (a professor of philosophy at the University of California, San Diego, and an associate of the Salk Institute), William Hasker joined in the attack on dualism by suggesting that “Mind-body dualists, who think the mind or soul is not fundamentally dependent on the brain, owe us a plausible account of these functional dependencies—an account that, so far as I know, is not yet forthcoming.”
Well, if the soul isn’t in some very fundamental way independent of the brain, it would be even more important to have a plausible account of just what part of the soul is lost to death when the brain dies. Hasker’s bland and accommodating observation that “the view that the mind is somehow produced or generated by the brain is not in conflict with any essential Christian doctrine, including the belief in eternal life,” actually avoids an endorsement of materialism only by dangling on the qualifier somehow. And the suggestion by Hasker that such a reduction simply reminds us “more seriously than has generally been done the truth that we are created from the dust of the earth” misses entirely the even more serious reminder that God breathed into us an immortal soul, and that the two get confused at our peril.
But Hasker is a philosopher, and a little theological blandness can be forgiven. From the theologians there is not only deafening silence but scarcely any recognition that a problem is brewing under their noses. If we are being saved in both body and soul, hadn’t we better secure a reasonably good grip on what we mean by the soul if the very idea of salvation is to remain coherent? Where is the evangelical theologian writing on the soul?
One place such an answer might begin is by questioning what virtually all of the consciousness mavens assume without much examination: that Descartes was wrong and that dualism is bad. Dismissing Descartes is the most common opening move of nearly everyone in consciousness studies today, including both Dennett and Searle, but before we join them it would be a good idea to see where that dismissal takes them. Deprecations of Descartes almost always function as a means toward collapsing any useful distinction between spiritual and material substance and establishing some form of materialism, and that is a dangerous goal for Christians to applaud.
Books Discussed in this Essay
Patricia S. Churchland and Terence J. Sejnowski, The Computational Brain (MIT Press, 1992), 544 pp.; $47.50, hardcover; $19.95, paper.
Francis Crick, The Astonishing Hypothesis: The Scientific Search for the Soul (Scribners, 1994), 317 pp.; $25.
Daniel Dennett, Consciousness Explained (Little, Brown, 1991), 511 pp.; $27.95.
Fred J. Dretske, Naturalizing the Mind (MIT Press, 1995), 208 pp.; $22.50, hardcover; $12.50, paper.
Gerald M. Edelman, Bright Air, Brilliant Fire: On the Matter of the Mind (Basic Books, 1992), 384 pp.; $35.
Roger Penrose, Shadows of the Mind: A Search for the Missing Science of Consciousness (Oxford University Press, 1994), 457 pp.; $25, hardcover; $16.95, paper.
Israel Rosenfield, The Strange, Familiar, and Forgotten: An Anatomy of Consciousness (Alfred A. Knopf, 1992), 157 pp.; $20.
John R. Searle, The Rediscovery of the Mind (MIT Press, 1992), 270 pp.; $30, hardcover; $14, paper.
I would like to ask, if it’s not too impertinent, just why Descartes, or at least mind-body dualism, should be so unspeakable. For one thing, it tends to come naturally, and that response should not be dismissed out of hand. As John Foster argues in The Immaterial Self: A Defence of the Cartesian Dualist Conception of the Mind (1991), “Our ordinary intuition is that, despite its attachment to an embodied subject, and despite its intimate causal dependence on the relevant neural processes, mentality cannot be reduced to non-mental factors.” Searle, in fact, concedes that “the man-in-the-street is a Cartesian,” and he likes to tell the story of going to hear a lecture by the Dalai Lama some years ago, only to find himself treated from that very unlikely source to a hearty discourse on mind-body dualism.
Dualism, admittedly, has its weaknesses. It can slide into skepticism about the reality of the external world; it lacks a good description of what guarantees that mind and body can interact; and it even creates difficulties in describing sleep. In Christian hands, dualism has often been pushed into various forms of Platonism, epiphenomenalism, and finally into the occasionalism so beloved of Nicholas Malebranche and Jonathan Edwards. But at least, as Foster points out, “the dualist gives a radically non-physicalist account of what exists or occurs within the mind: he takes sensations, thought-episodes, decisions, instances of belief, and so on, to be wholly non-physical—to be devoid of any intrinsic physical attributes or location in physical space.” That is not a bad way of accounting for the soul; and, for that matter, Malebranche and Edwards are not bad theological company.
One thing is certain, though. As Dennett writes, “This is a glorious time to be involved in research on the mind.” At this moment, he rejoices, “the frontier of research on the mind is so wide open that there is almost no settled wisdom about what the right questions and methods are.” It will be strange, considering what is at stake, if Christians seize no part of this glorious time for their own. “Now is the time to take the problem of consciousness seriously,” Francis Crick declares, and I could not agree more. What, after all, does it profit a man to gain the whole world, and forfeit his soul?
Allen C. Guelzo is Grace F. Kea Professor of American History at Eastern College.
Copyright © 1997 by the author or Christianity Today/Books & Culture Magazine. For reprint information call 630-260-6200 or e-mail bceditor@BooksAndCulture.com.
Timothy C. Morgan
When technology functions as a religion, as savior and liberator, we begin to project divine attributes onto it.
Whenever I come across new prognostications about how the Internet, cyberspace, or virtual reality will bring about either a new age of prosperity, or a millennium of evil and despair, I enjoy reaching for a favorite book, The Experts Speak: The Definitive Compendium of Authoritative Misinformation (Pantheon, 1984).
A joint project of The Nation magazine and the Institute of Expertology, this book has a chapter on Homo Faber (Man the toolmaker) and the unstoppable march of technology, including such gems as these:
- “I think there is a world market for about five computers,” a remark made in 1943, attributed to Thomas J. Watson, the late chairman of IBM.
- “[A] few decades hence, energy may be free—just like the unmetered air,” John von Neumann, the Fermi Award-winning mathematician and cofounder of game theory, in 1956.
- “There is no reason for any individual to have a computer in their home,” Ken Olsen, founder of Digital Equipment Corporation, in 1977.
The technoprophecies of the 1990s, often either deeply pessimistic or blithely utopian, view the development of cyberspace as if it were a newly discovered continent, full of delights to be exploited and dangers to be sidestepped.
In much the same way as explorers of an earlier era did, the cybernauts of this new world, courtesy of more than 120 million computers linked worldwide, are bringing all the great religions of the world along with them. Roman Catholics, Protestants, Jews, Muslims, Hindus, Buddhists, and thousands of other religious groups have staked out territory in cyberspace. For the uninitiated Christian, the Christian Cyberspace Companion, by Jason Baker, furnishes an excellent introduction to the Internet and provides a valuable appendix of key religion resources.
Now that religion and the cyberfaithful are online, what do they aim to accomplish? How is cyberspace changing the rules and practice of religion? What are the tradeoffs?
Answering such questions depends on understanding, as best we can, how the Internet is altering our environment for better and worse. The experts and explorers may often be wrong in their predictions—not only because the pace of technological development is so rapid, but also because it is a remarkably human process, subject to unpredictable twists and turns.
INTERNET AND RELIGION
An insightful commentary on cyber-pathologies, Virtual Gods, edited by Tal Brooke, in part advances a critique of the Internet through citing ideas from scholar and social critic Jacques Ellul: that technological progress “has its price,” that it raises more and greater problems than it solves, that technology’s benefits are inseparable from its destructive effects, and that every major technological innovation leads to many unforeseen consequences.
While we need these cautions against an uncritical faith in “progress,” there is a difference between healthy skepticism and a demonizing of technology. A reader of Virtual Gods may be left with the impression that the use of technology is invariably a Faustian bargain, that it inevitably dehumanizes us and ultimately costs us our souls.
Certainly no human contraption, high-tech, low-tech, or no-tech, should ever be called neutral or impartial. In this decade, the Internet has repositioned itself at least three times, going from a strategic defense program, to the university, and finally to the open commercial marketplace on a worldwide scale. If you look in detail at how the Internet has developed, you see its technology progressing to give its users a competitive advantage. Commercial junk e-mail, for example, was a nonexistent problem two years ago. Today it has become a huge issue for anyone with an e-mail address. It’s a new way of selling goods and services, giving vendors a leg up on their competition.
Among religions, a competitive advantage is pursued vigorously—if not for converts, then for public opinion and recognition. Although technophiles who embrace the Internet proclaim how it makes us better, I most often observe how little cyberspace has altered our behavior or core identities in any fundamental way. Jeff Zaleski’s The Soul of Cyberspace chronicles how Sufi mystics, Roman Catholics, Lubavitchers, and other religious groups function online. They use the World Wide Web and other Internet areas to display an electronic self-portrait.
But what durable value is there in being online? Internet technology accelerates the rate of cultural change, intensifies new opportunities, and leverages one’s grasp to global proportions. Computer technology allows us to do things faster, cheaper, and with greater precision. But for some, that is not enough. Author Zaleski’s persistent question on whether prana, the Hindu term for life force, is transmittable via computers suggests that the supreme achievement for technology would be a computer achieving humanlike consciousness.
INTERNET AS RELIGION
A handful of leading postmodernists, New Agers, cyber-utopians, and others aspire to a future in which the Internet would function as a godlike force: omnipresent, omniscient, and omnipotent.
While most of us consider computers as nifty tools to help us do our jobs better, this emerging movement has goals that seem straight out of a Star Trek episode:
- Developing computer systems to accept an upload of a dying individual’s intelligence and personality.
- Creating a worldwide-linked computer system that would operate as a global mind.
- Reproducing computer-generated virtual realities that would operate with the same legitimacy as everyday reality.
If you set aside the scientific improbability of achieving such goals, these aspirations reveal a deeply spiritual agenda. When technology functions as a religion, as savior and liberator, we begin to project divine attributes onto it. We long for a way to take the sting out of death, for a connection to something larger and wiser than ourselves, and for a way to control and change our environments at will.
These desires expose the great risk of overreliance on technological solutions to the exclusion of other means. The force and power of technological progress have a multiplying, accumulating effect, to the point where technology may become the dominant influence on our lives. And within such dominance lie its natural limitations. Less than 10 years after the beginning of widespread availability of Internet access, there are books on infoglut, cyberaddictions, and other maladjustments.
In The Soul in Cyberspace, Douglas Groothuis, assistant professor of religion and ethics at Denver Seminary, probes the extremist fringe of cyberideology from an evangelical perspective. A wild assortment of radical libertarians, antihumanist philosophers, and others with diverse and often contradictory agendas have seen in this new technology the fulfillment of their dreams. The Internet, true to its malleable character, becomes putty in their ideological hands. You would be hard-pressed to find a more alarming yet responsible survey of the ‘Net’s dark side than Groothuis’s book.
There is ample reason for these concerns. The impact of pornography, for example, has been dramatically increased by the Internet. A recent newspaper article about the World Wide Web was headlined “News and Nudes.” It detailed how the Wall Street Journal and Playboy magazine are among the few organizations well positioned to make a profit on the Web.
Nevertheless, it is a mistake to define the Internet solely by its excesses or perversions, as some Christians—and evangelicals in particular—seem to be in danger of doing. If you look at how people routinely interact with cyberspace technology, the Internet is essentially a communications medium. It is a means, not an end; a vehicle, not a destination. As a literary metaphor, cyberspace’s power comes from its ability to reshape our imaginations. Most of the worries about neognostics and postmodernists in cyberspace are overstated. The Internet is a technology of ideological convenience and is so plastic in its application that it mirrors more than remolds its users.
The Soul of Cyberspace, by Jeff Zaleski (HarperSanFrancisco, 284 pp.; $22)
Christian Cyberspace Companion: A Guide to the Internet and Christian Online Resources, by Jason D. Baker (Baker Book House, 2nd ed., 250 pp.; $15.99, paper)
Virtual Gods, edited by Tal Brooke (Harvest House, 221 pp.; $10.99, paper)
The Soul in Cyberspace, by Douglas Groothuis (Baker Book House, 192 pp.; $9.99, paper)
That is precisely the point made by Columbia University’s Jaron Lanier, the computer pioneer who developed the virtual reality interface glove. He explains, in an interview in The Soul of Cyberspace:
The Internet is a giant mirror being held up to mankind that we’ve never had before. It reflects all of our flaws and our embarrassments as well as our best qualities. It’s an honest mirror. So there’s a lot of dreadful stuff, but it’s really us. It’s really who we are.
WHAT HAVE-NOTS HAVE
I live and work in suburban Chicago. For my work as a journalist, I have four e-mail addresses, a Web site at my disposal, telephone, voice mail, fax, two computers on my desk, and two at home. Many of my coworkers are in a similar technology-rich environment.
I have an Anglican missionary friend who, until the recent military coup, was stationed in Zaire, now known as the Democratic Republic of Congo. She had nothing except a telephone and an electronic typewriter—and not always the electricity to use them. Her situation is not uncommon for the 740 million people living on the African continent, probably the least-wired region on the planet.
One of the most troublesome aspects of the Internet is how poorly its benefits are distributed. By and large, well-educated, English-speaking men in developed countries dominate cyberspace technology. Although an e-mail address and an Internet service provider do bring the world to your desktop, there are billions of people in Asia, Africa, and other parts of the developing world who have no hope of gaining significant access to the Internet in the foreseeable future.
Traditionally, religious people would see this lack of access as an issue of economic and social justice. There are trickle-down ministry efforts that pass along computers and software to the disadvantaged. The predominant paradigm for viewing people who do not have technology at their fingertips is that they are poor, and we, the cybertech generation, are rich. Granted, there is an objective scale of rich and poor, which covers the spectrum from absolute poverty (no food, shelter, or clothing) to fabulous wealth. But there are other measures.
What I have discovered in visits to rural African villages in Uganda is that the technologically deprived, the have-nots of the modern era, have a quality and quantity of human interaction that we technophiles have lost. (Please keep in mind that I am referring almost exclusively to communications technology like the Internet, not medical or agricultural technology.)
Our high-tech society increasingly interposes a machine interface between people. Parents with children in daycare in some American cities now even have the option of using Internet video to monitor their children and caregivers. Such interaction could hardly be satisfying, but parents are choosing to use the Internet in this way when they lack any ready alternative.
FAITH FORMATION AND CYBERCULTURE
Over the next several years, the growth rate of cyberspace will slow somewhat, but the Internet will continue to expand. It will become more commercial, more like television, and easier to use. The nuisance of junk e-mail, the proliferation of cyberporn, threats to personal privacy: all these will increase as well, not to mention as-yet unanticipated side effects. The ‘Net is a work in progress. Author Jason Baker poses the challenge to Christians: “If we retreat from our call to be salt and light, then the foreseeable future of cyberspace could be much like that of television.”
So is it the fate of the Internet to become the cyberwasteland of the twenty-first century? Christians who aim at penetrating the Internet and striving for dominance are just as capable of laying waste to cyberspace as any predatory technocapitalist. Many evangelicals are eager to use technology to speed up the process of spiritual formation, extend the outreach of evangelistic efforts, and enhance our means of ministry to needy people. But these efforts must not interfere with the biblical reality that God mediates his relationship with us through the living Christ and his Holy Spirit, not through a cathode-ray tube, a QWERTY keyboard, and a Pentium II processor.
Timothy C. Morgan is associate editor of Christianity Today magazine and head of CT’s news department.
Copyright © 1997 by the author or Christianity Today/Books & Culture Magazine. For reprint information call 630-260-6200 or e-mail bceditor@BooksAndCulture.com.