Tuesday, March 31, 2009

Confessions of a Contrarian

A recent article in this past Sunday's New York Times captured my interest and reminded me about something I rarely own up to: I am a contrarian.

The article profiled Freeman Dyson (great first name), the most celebrated of today's small group of scientists who do not accept the current dogma on Global Warming. Dyson is not, as might be imagined, one to pooh-pooh concern for the environment. On the contrary, he is passionately dedicated to responsible use of the environment. Nor is he a crackpot. He is, in fact, one of the most accomplished scientists of his generation. He is, however, inclined to pooh-pooh science used inappropriately, even in support of a good cause.

Global Warming could be all that those who postulate its presence claim it to be. It could be entirely our fault (the result of excessive CO2 in the upper atmosphere, the result, in turn, of our excessive use of hydrocarbon fuels) and the greatest threat yet to our continued existence on this planet.

Alternatively it could be a significant threat that will surely change the way we live and upset the current environmental balance, thus significantly altering things — all things — leaving nothing as we know it now. Disruptive, to be sure, but not catastrophically destructive.

Or it could be a primarily natural occurrence — a continuation of a cycle of warming and cooling that the earth has experienced for far longer than we've had the ability to record history, one which we contribute to by burning hydrocarbons, but one that would happen in any case, no matter what we did.

Or, what we perceive as global warming could be a temporary, and relatively benign, wobble in a much longer warming/cooling cycle, a wobble that poses no serious or imminent threat.

Reasonable cases can be made for each position — cases that account for the data we possess. (Keep in mind that the scientific case for global warming hinges on a one-degree difference in the average annual temperature recorded on Earth.)

Dyson pooh-poohs the idea that we can confidently extrapolate from meteorological records covering a tiny slice of geological time (one source says since 1847) sufficient evidence to support any of these theories beyond a reasonable doubt. He believes that many of the dire claims and warnings Al Gore recorded in his celebrated film are, for all he knows, just so much pooh-pooh. Gore could be right, but he could just as easily be wrong. Dyson's basic contention (one I share) is that we just don't know. We simply don't have sufficient data.

Dyson performs the entirely necessary if disruptive service of the contrarian, loudly proclaiming, as a means of pursuing much needed balance, some heresy in the face of the rising tide of scientific certitude.

Personally, I have no problem with people believing that continuing to drive their gas-guzzling cars and denude the world of its forests will kill them. Fear is a pretty strong motivation. So if that's what it takes to motivate people to change, then so be it. We need to stop doing both (and a lot of other foolish things), and the sooner the better, for lots of good reasons (the fact that pollution is demonstrably bad for your health and is, therefore, killing you, for one) that have nothing to do with global warming.

But I don't think people need to believe a lie to do the right thing. And peddling a half-truth or a suspected truth as absolute truth may get things done in the short term, but I think history testifies loud and clear that a lie, even in support of truth, ultimately tarnishes the truth. Truth that is not entirely true — mere supposition or assumption masquerading as truth — is often the worst kind of lie.

More power, then, to contrarians everywhere, who dare to question the received wisdom, sometimes at great cost, when "experts" get so sure of themselves that they no longer ask the questions that got them where they are.

Saturday, March 21, 2009

The Absurdity of the "Quality of Life" Debate

This article from BBC News, in which a medical ethicist weighs in on a U.K. "right to life" case under adjudication in the British courts, reminds us of the essential futility of making decisions in such cases based on so-called "quality of life."

The case concerns a small child afflicted with a terminal disease. The doctors are suing for the right to discontinue treatment and withdraw life support because they believe the child is subject to "intolerable suffering." The father, a Muslim, believes that the right to take a life belongs to God, not humankind, and the mother contends that, in any case, the child's life is not without its compensatory pleasures. The court, for the moment, has sided with the parents. But the case, whichever way it ultimately falls out, will further reinforce a legal precedent that is dangerous, misguided and patently immoral.

The crux of the issue is the impossibility of adequately defining life's quality. In an age when the requirements of the simple business contract can be argued in court for decades, the very idea that a satisfactory legal definition for what constitutes "quality of life" could ever be forged is absurd. The medical ethicist in this case admits as much when he says (italics mine):
Intolerable suffering is not an objective criterion. Suffering, like pleasure, is a purely subjective experience and there exists no scientific instrument that shows exactly how much an individual is suffering.
How is it, then, that we continue to pursue such a definition? In his very next sentence, the ethicist finds what he thinks is a partial answer, noting that the only way to know for sure whether a person's suffering is "intolerable" is to ask him/her, which, in the current case, is not possible.

The problem with that, of course, is that I have experienced what I judged (at the time) to be intolerable suffering. And I know many others who have as well. By this man's definition, people afflicted with chronic depression, for example, could tell us at a point of pain that we need to help them end their painful lives, and we'd be bound to do it. (That, of course, is the position of the so-called "Right to Die" lobby.) But of course, the world is full of people who are glad that they didn't drive off that cliff, take those pills, pull the trigger or otherwise initiate the end of their own lives but instead were prevented from doing so by caring family and/or friends or, on their own, grasped hold of their will to live and let it pull them out of intolerable suffering.

We are no better judges of what's best for us when we're in pain than anyone else would be. And this line of argument has no bearing anyway on the rights of those, both born and unborn, who cannot yet express themselves.

"Quality of life" is inherently one of the slipperiest of ethical slippery slopes, and one down which a society increasingly divorced from God or absolutes of any kind is doomed to go. The medical ethicist in this article, in fact, recognizes the vast, uncrossable gulf between the doctor/scientist, who deals only with what he can see, and the parent, particularly the religious parent, who taps into what cannot be seen. He even admits that no one can fault the parents in question:

For the parents, these pleasures are sufficient to constitute a worthwhile life. Based on these beliefs, their decision to fight for their son's ongoing treatment is understandable. Indeed, we would be deeply concerned if anyone with these beliefs willingly allowed their child to die.

Indeed. But then he goes on to make this astonishing statement:

Although commentators have expressed much sympathy for the parents, they have generally overlooked the moral challenges for the medical team. In the doctors' eyes, by continuing to treat Baby MB with painful and futile measures, they are treating a vulnerable child against his best interests and violating a basic tenet of medical practice: first, do no harm. Ironically, in these special circumstances, it is keeping the child alive that constitutes the harm.
Really!??! Since the greatest harm they could do (particularly from the point of view of the scientist who holds no belief in an afterlife) is to end his existence, continuing to treat him is arguably the lesser evil. Sorry, but that seems to me to be pretty simple. In fact, it is foundational to every legal system that those who end the life of an innocent with premeditation have committed murder. But our ethicist persists:

The child's neurologist, Dr S, said: "I have been feeling that what I have been doing as a doctor has been wrong for many months, which is a very difficult position for me to be in." The wrongness lies not only in acting against his conscience (which is distressing enough), but in being complicit in a child's profound and avoidable suffering. It is no surprise that some of the doctors have expressed a reluctance to carry on treating Baby MB if the ruling goes against them — which it now has.

Well, there you have it: This is a case of the medical community "feeling" like it's in a "difficult" position and, therefore, insisting on its right to relief from its own suffering. And, he suggests, the doctors are not sure they're willing to comply with the court's judgment, despite the fact that the court has ruled against them in accordance with the "quality of life" criteria they claim to live by.

If today's medical ethicists have their way, the world will eventually be robbed of the greatness, even the genius, that is wrought, in part, by people who fight intolerable suffering and handicap and survive to contribute much to the world's more fortunate and less pain-stricken. What would the world be without a Stephen Hawking? Or, to turn it around, a Mother Teresa, who believed that loving and comforting and valuing those in pain made more sense than killing them? Who, in fact, gave up what could have been a nice life like yours or mine to devote herself to those in intolerable pain?

The answer is, it would be a world in which the weakest, smallest and most vulnerable would be done away with by the powerful who, as a consequence of their own weakness, smallness and vulnerability, would presume to determine another's ultimate value.

Hasn't the world had enough of that already?

Saturday, March 07, 2009

Watching Watchmen

"Who is watching the Watchmen?"

The answer this weekend is lots of people. A much anticipated film adaptation of this now legendary graphic novel premiered in thousands of theaters Friday. Hotly debated in the entertainment press even before the first trailers appeared, the film was declared a sure failure by purists (called fanboys), disowned by its author and doubted by critics who consider the novel's dense, flashback-laden, multilayered story-within-a-story structure and fantastic imagery unfilmable. Indeed, the film's director, Zack Snyder, had been preceded by many who attempted, then abandoned, Watchmen film projects.

An unashamed fanboy himself, Snyder spent much time during the film's post-production period explaining and defending his vision of the book as film, reassuring fans that he would be faithful to the original. People went to the theater either in fearful anticipation, hoping for the best, or out of morbid curiosity, unwilling to pass up the chance to discuss a good train wreck over a latte.

How I came to be among the legion of Watchmen watchers Friday night and write this review deserves some explanation. Let me first say that I come late to the party. Until this past Christmas, I had never heard of Watchmen and had only the sketchiest notion of what a graphic novel was (a glorified comic book, right?). But I had determined to get my younger son, who requested only video games for gifts this year, at least one book. At the local book store, the yellow cover and its blood-spattered "happy face" badge caught my eye, and I just had to look. I didn't buy it right then, but did do some research. Turns out I had had in my hands what more than one reviewer called "the most celebrated graphic novel of all time," one that no less than Time magazine had named to its list of 100 Greatest Novels written in the past century. Well. So ... I took a chance.

Written by semi-reclusive Alan Moore, a self-described anarchist and comic book industry demi-god, Watchmen is considered his and that industry's masterpiece. In it, Moore, a Briton, creates a parallel universe version of the U.S. in 1985, in which Richard Nixon was not dethroned by Watergate, we won the Vietnam War, the comic book heroes have character flaws of the sort usually reported in supermarket tabloids, the still-raging Cold War is threatening to get nuclear hot, and a government experiment gone wrong has created a neon-blue superhuman who sees the future and could save the world or destroy it.

Moore envisions America gone mad for crime, sex and drugs after the second generation of a vigilante crime-fighting group formed in the 1940s to clean up America is forced to disband in the 1970s. His unmasked and decaped crew includes the Nite Owl (who still visits his underground lair in an abandoned subway tunnel, where his hovercraft and armored hero suit gather dust; Silk Specter (the daughter of the 1940's original); the embittered Comedian, who embarks on a second career doing government dirty work; and the regal, aloof Ozymandias, reputedly the world's smartest man and one of its richest, as well, having written a tell-all book and reaped the rewards of merchandising his former identity. As the story opens, the sinister Rorshach, an outcast, even among his fellow hero has-beens, and a suspected psychotic, investigates the Comedian's murder. From there, Moore's ingeniously conceived dark plot and complex, chilling characterizations coupled with famed illustrator Dave Gibbons' no-pen-stroke-wasted illustrations draw you into a can't-put-it-down encounter with a creative imagination way ahead of its time. Sometimes cynical, other times sympathetic, Moore's enigmatic commentary on the human condition has earned its high place in the pantheon of popular literature. He asks the question with which I began this review, and leaves us to comtemplate its implications.

So far, however, this is a book review. And I am among those who, having read the book first, rarely think the movie version compares well. But director Snyder's effort proved to be an admirable exception, despite some probably inevitable shortcomings. Visually, the film is startling and stunningly faithful to Gibbons' vision. Gibbons, in fact, was on hand to help Snyder and a small army of CG technicians recreate Moore's dark world and his masked characters with the kind of obsessive faithfulness to detail that was simply not available to filmmakers of previous generations. And the scriptwriters managed to include in their screenplay much of the story's interwoven fabric by deftly rearranging and carefully abbreviating lengthier flashbacks and dialogue taken from the book. Nevertheless, several of the book's more inventive devices are missing. The saddest omission is that of a parallel terror tale involving a doomed pirate that illuminates Moore's main narrative. (Ironically, it's told in a comic book read by a bit character who haunts a local newsstand.) Despite the missing elements, the movie is long by Hollywood standards (2 hrs, 43 min), but as I told my son on the way home in the car, I'd have sat through four hours to get more of the book on film.

That said, the film is faithful, at least in spirit, to the original story and, fanboy critic protestations notwithstanding, it delivers. It made me laugh, recoil in horror and relate in all the right places, and think about bigger things, as Moore intended. And it moved me to tears twice — something the book did not do. Snyder's excision of the "aliens" element at the end (I can't say more without a spoiler alert) is, in my opinion, an improvement, not a problem. Whether it's a winner at the box office or not (early returns favor the former), it'll certainly collect my $26.95 for the deluxe two-disc boxed set when it comes out on DVD.

Bottom line? I suspect that Watchmen will narrowly miss the cut as great art when my son's son's kids look back. And its dark vision, violence and nudity (the movie is rated "R") will put some people off. Let me also make clear that I do not necessarily agree with Moore's cataclysmic vision of life on earth, nor do I subscribe to the remedy the story's unlikely hero/villain ultimately implements for its troubles. (In the ironic final scene, Moore suggests his own ambivalence.) But the book and the film are an important window into the philosophical universe inhabited by this generation — a generation that has confronted, recoiled from and begun to accept its shadow side earlier than most, yet still believes that truth — even dark truth — is worth fighting for and a dying world of broken people is worth saving.