Graduates Versus Oligarchs
By PAUL KRUGMAN
Ben Bernanke's maiden Congressional testimony as chairman of the Federal Reserve was, everyone agrees, superb. He didn't put a foot wrong on monetary or fiscal policy.
But Mr. Bernanke did stumble at one point. Responding to a question from Representative Barney Frank about income inequality, he declared that "the most important factor" in rising inequality "is the rising skill premium, the increased return to education."
That's a fundamental misreading of what's happening to American society. What we're seeing isn't the rise of a fairly broad class of knowledge workers. Instead, we're seeing the rise of a narrow oligarchy: income and wealth are becoming increasingly concentrated in the hands of a small, privileged elite.
I think of Mr. Bernanke's position, which one hears all the time, as the 80-20 fallacy. It's the notion that the winners in our increasingly unequal society are a fairly large group — that the 20 percent or so of American workers who have the skills to take advantage of new technology and globalization are pulling away from the 80 percent who don't have these skills.
The truth is quite different. Highly educated workers have done better than those with less education, but a college degree has hardly been a ticket to big income gains. The 2006 Economic Report of the President tells us that the real earnings of college graduates actually fell more than 5 percent between 2000 and 2004. Over the longer stretch from 1975 to 2004 the average earnings of college graduates rose, but by less than 1 percent per year.
So who are the winners from rising inequality? It's not the top 20 percent, or even the top 10 percent. The big gains have gone to a much smaller, much richer group than that.
A new research paper by Ian Dew-Becker and Robert Gordon of Northwestern University, "Where Did the Productivity Growth Go?," gives the details. Between 1972 and 2001 the wage and salary income of Americans at the 90th percentile of the income distribution rose only 34 percent, or about 1 percent per year. So being in the top 10 percent of the income distribution, like being a college graduate, wasn't a ticket to big income gains.
But income at the 99th percentile rose 87 percent; income at the 99.9th percentile rose 181 percent; and income at the 99.99th percentile rose 497 percent.
No, that's not a misprint.
Just to give you a sense of who we're talking about: the nonpartisan Tax Policy Center estimates that this year the 99th percentile will correspond to an income of $402,306, and the 99.9th percentile to an income of $1,672,726. The center doesn't give a number for the 99.99th percentile, but it's probably well over $6 million a year.
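A quick back-of-the-envelope check of those figures (a minimal sketch of my own, not part of the column) converts the cumulative gains quoted above into compound annual rates; the only inputs are the percentages and the 1972-2001 window reported in the Dew-Becker and Gordon paper.

def annualized_growth(total_growth, years):
    """Convert a cumulative percentage gain into a compound annual rate."""
    return (1 + total_growth) ** (1 / years) - 1

years = 2001 - 1972  # 29 years
gains = [("90th percentile", 0.34), ("99th percentile", 0.87),
         ("99.9th percentile", 1.81), ("99.99th percentile", 4.97)]
for label, total in gains:
    print(f"{label}: about {annualized_growth(total, years):.1%} per year")
# Roughly 1.0%, 2.2%, 3.6% and 6.4% per year, respectively.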
Why would someone as smart and well informed as Mr. Bernanke get the nature of growing inequality wrong? Because the fallacy he fell into tends to dominate polite discussion about income trends, not because it's true, but because it's comforting. The notion that it's all about returns to education suggests that nobody is to blame for rising inequality, that it's just a case of supply and demand at work. And it also suggests that the way to mitigate inequality is to improve our educational system — and better education is a value to which just about every politician in America pays at least lip service.
The idea that we have a rising oligarchy is much more disturbing. It suggests that the growth of inequality may have as much to do with power relations as it does with market forces. Unfortunately, that's the real story.
Should we be worried about the increasingly oligarchic nature of American society? Yes, and not just because a rising economic tide has failed to lift most boats. Both history and modern experience tell us that highly unequal societies also tend to be highly corrupt. There's an arrow of causation that runs from diverging income trends to Jack Abramoff and the K Street project.
And I'm with Alan Greenspan, who — surprisingly, given his libertarian roots — has repeatedly warned that growing inequality poses a threat to "democratic society."
It may take some time before we muster the political will to counter that threat. But the first step toward doing something about inequality is to abandon the 80-20 fallacy. It's time to face up to the fact that rising inequality is driven by the giant income gains of a tiny elite, not the modest gains of college graduates.
Monday, February 27
War, Inc. - Ike Saw it Coming
By BOB HERBERT
Early in the documentary film "Why We Fight," Wilton Sekzer, a retired New York City police officer whose son was killed in the World Trade Center attack, describes his personal feelings in the immediate aftermath of Sept. 11.
"Somebody had to pay for this," he says. "Somebody had to pay for 9/11. ... I wanna see their bodies stacked up for what they did. For taking my son."
Lost in the agony of his grief, Mr. Sekzer wanted revenge. He wanted the government to go after the bad guys, and when the government said the bad guys were in Iraq, he didn't argue.
For most of his life Mr. Sekzer was a patriot straight out of central casting. His view was always "If the bugle calls, you go." When he was 21 he was a gunner on a helicopter in Vietnam. He didn't question his country's motives. He was more than willing to place his trust in the leadership of the nation he loved.
"Why We Fight," a thoughtful, first-rate movie directed by Eugene Jarecki, is largely about how misplaced that trust has become. The central figure in the film is not Mr. Jarecki, but Dwight Eisenhower, the Republican president who had been the supreme Allied commander in Europe in World War II, and who famously warned us at the end of his second term about the profound danger inherent in the rise of the military-industrial complex.
Ike warned us, but we didn't listen. That's the theme the movie explores.
Eisenhower delivered his farewell address to a national television and radio audience in January 1961. "This conjunction of an immense military establishment and a large arms industry is new in the American experience," he said. He recognized that this development was essential to the defense of the nation. But he warned that "we must not fail to comprehend its grave implications."
"The potential for the disastrous rise of misplaced power exists and will persist," he said. "We must never let the weight of this combination endanger our liberties or democratic processes." It was as if this president, who understood war as well or better than any American who ever lived, were somehow able to peer into the future and see the tail of the military-industrial complex wagging the dog of American life, with inevitably disastrous consequences.
The endless billions to be reaped from the horrors of war are a perennial incentive to invest in the war machine and to keep those wars a-coming. "His words have unfortunately come true," says Senator John McCain in the film. "He was worried that priorities are set by what benefits corporations as opposed to what benefits the country."
The way you keep the wars coming is to keep the populace in a state of perpetual fear. That allows you to continue the insane feeding of the military-industrial complex at the expense of the rest of the nation's needs. "Before long," said Mr. Jarecki in an interview, "the military ends up so overempowered that the rest of your national life has been allowed to atrophy."
In one of the great deceptive maneuvers in U.S. history, the military-industrial complex (with George W. Bush and Dick Cheney as chairman and C.E.O., respectively) took its eye off the real enemy in Afghanistan and launched the pointless but far more remunerative war in Iraq.
If you want to get a chill, just consider the tragic chaos in present-day Iraq (seven G.I.'s were killed on the day I went to see "Why We Fight") and then listen to Susan Eisenhower in the film recalling a quotation attributed to her grandfather: "God help this country when somebody sits at this desk who doesn't know as much about the military as I do."
The military-industrial complex has become so pervasive that it is now, as one of the figures in the movie notes, all but invisible. Its missions and priorities are poorly understood by most Americans, and frequently counter to their interests.
Near the end of the movie, Mr. Sekzer, the New York cop who lost his son on Sept. 11, describes his reaction to President Bush's belated acknowledgment that "we've had no evidence that Saddam Hussein was involved" in the Sept. 11 attacks.
"What the hell did we go in there for?" Mr. Sekzer asks.
Unable to hide his bitterness, he says: "The government exploited my feelings of patriotism, of a deep desire for revenge for what happened to my son. But I was so insane with wanting to get even, I was willing to believe anything."
War, Inc. - Army to Pay Halliburton Unit Most Costs Disputed by Audit
By JAMES GLANZ
The Army has decided to reimburse a Halliburton subsidiary for nearly all of its disputed costs on a $2.41 billion no-bid contract to deliver fuel and repair oil equipment in Iraq, even though the Pentagon's own auditors had identified more than $250 million in charges as potentially excessive or unjustified.
The Army said in response to questions on Friday that questionable business practices by the subsidiary, Kellogg Brown & Root, had in some cases driven up the company's costs. But in the haste and peril of war, it had largely done as well as could be expected, the Army said, and aside from a few penalties, the government was compelled to reimburse the company for its costs.
Under the type of contract awarded to the company, "the contractor is not required to perform perfectly to be entitled to reimbursement," said Rhonda James, a spokeswoman for the southwestern division of the United States Army Corps of Engineers, based in Dallas, where the contract is administered.
The contract has been the subject of intense scrutiny after disclosures in 2003 that it had been awarded without competitive bidding. That produced criticism from Congressional Democrats and others that the company had benefited from its connection with Dick Cheney, who was Halliburton's chief executive before becoming vice president.
Later that year auditors began focusing on the fuel deliveries under the contract, finding that the fuel transportation costs that the company was charging the Army were in some cases nearly triple what others were charging to do the same job. But Kellogg Brown & Root, which has consistently maintained that its costs were justified, characterized the Army's decision as an official repudiation of those criticisms.
"Once all the facts were fully examined, it is clear, and now confirmed, that KBR performed this work appropriately per the client's direction and within the contract terms," said Cathy Mann, a company spokeswoman, in a written statement on the decision. The company's charges, she said, "were deemed properly incurred."
The Pentagon's Defense Contract Audit Agency had questioned $263 million in costs for fuel deliveries, pipeline repairs and other tasks that auditors said were potentially inflated or unsupported by documentation. But the Army decided to pay all but $10.1 million of those contested costs, which were mostly for trucking fuel from Kuwait and Turkey.
That means the Army is withholding payment on just 3.8 percent of the charges questioned by the Pentagon audit agency, which is far below the rate at which the agency's recommendation is usually followed or sustained by the military — the so-called "sustention rate."
Figures provided by the Pentagon audit agency on thousands of military contracts over the past three years show how far the Halliburton decision lies outside the norm.
In 2003, the agency's figures show, the military withheld an average of 66.4 percent of what the auditors had recommended, while in 2004 the figure was 75.2 percent and in 2005 it was 56.4 percent.
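The 3.8 percent figure follows directly from the numbers reported above; here is a minimal sketch (mine, not the article's) of that arithmetic, with the typical withholding rates shown for comparison.

questioned = 263.0  # $ millions in costs questioned by Pentagon auditors
withheld = 10.1     # $ millions the Army ultimately declined to pay
print(f"Share of questioned charges withheld: {withheld / questioned:.1%}")  # ~3.8%

# Average share withheld on other contracts, per the audit agency's figures
for year, rate in [("2003", 0.664), ("2004", 0.752), ("2005", 0.564)]:
    print(f"{year}: {rate:.0%} of auditors' recommendations sustained")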
Rick Barton, co-director of the postconflict reconstruction project at the Center for Strategic and International Studies in Washington, said despite the difficulties of doing business in a war zone, the low rate of recovery on such huge and widely disputed charges was hard to understand. "To think that it's near zero is ridiculous when you're talking these kinds of numbers," he said.
The Halliburton contract is referred to as a "cost-plus" agreement, meaning that after the company recovers its costs, it also receives various markups and award fees. Although the markups and fees are difficult to calculate exactly using the Army figures, they appear to be about $100 million.
One of Halliburton's most persistent critics, Representative Henry A. Waxman, a California Democrat who is the ranking minority member of the House Committee on Government Reform, said in a written statement about the Army's decision, "Halliburton gouged the taxpayer, government auditors caught the company red-handed, yet the Pentagon ignored the auditors and paid Halliburton hundreds of millions of dollars and a huge bonus."
About $208 million of the disputed charges was mostly related to the cost of importing fuel, which was at the heart of the controversy surrounding the contract. Kellogg Brown & Root hired a little-known Kuwaiti company, Altanmia, to transport fuel in enormous truck convoys. The Pentagon auditors found that in part because of the transportation fees that Kellogg Brown & Root agreed to pay Altanmia, the cost for a gallon of gasoline was roughly 40 percent higher than what the American military paid when it did the job itself — under a separate contract it had negotiated with Altanmia.
The Army said in a written statement that it had largely accepted Kellogg Brown & Root's assertions that costs had been driven up by factors beyond its control — the exigencies of war and the hard-line negotiating stance of the state-owned Kuwait Petroleum Corporation. The Army said the Kuwaiti fuel company blocked attempts by Kellogg Brown & Root to renegotiate its transportation contract with Altanmia. In the end, the Army decided to pay the Halliburton subsidiary all but $3.81 million of the $208 million in fuel-related costs questioned by auditors.
The Kellogg Brown & Root contract, called Restore Iraqi Oil, or RIO, will be paid with about $900 million of American taxpayer money and $1.5 billion of Iraqi oil proceeds and money seized from Saddam Hussein's government. Official criticism of the work became so intense that in November, an auditing board sponsored by the United Nations recommended that the United States repay some or all of the $208 million related to the alleged fuel overcharges — an allegation Halliburton says has never been justified.
In fact, Ms. Mann said, the Army's decision clearly showed that "any claims that the figures contained in these audit reports are 'overcharges' are uninformed and flat wrong." She said that the fuel charges themselves had been 100 percent reimbursed and that the reductions all came from adjustments on administrative costs associated with that mission.
Still, the Army conceded that some of the criticisms of the company's business practices were legitimate. As a result, the Army said, it would exclude about half of the auditors' questioned charges from the amount used to derive the markups and fees, which are calculated as a sliding percentage of the costs. That decision could cost the company a maximum of about $7 million.
Ms. James, the Corps of Engineers spokeswoman, said that in addition to the other modest penalties that Kellogg Brown & Root had been assessed by the Army's contracting officers, the sliding percentages on some of the fees had been lowered by unspecified amounts to reflect shortcomings in the company's dealings in Iraq. "All fees were awarded in accordance with the award fee plan set out in the contract, which placed more emphasis on timely mission accomplishment than on cost control and paperwork," Ms. James said.
Mr. Barton, of the Center for Strategic and International Studies, said that with the relatively small penalties paid by the company for falling short in its performance in Iraq, it was hard to see what the Army's scrutiny of the company's practices had amounted to in the end.
"When they say, 'We questioned their business model or their business decisions' — well, yeah, so what?" Mr. Barton said. "You questioned it but there was no result."
In answer to written questions, a spokesman for the Defense Contract Audit Agency, Lt. Col. Brian Maka, said the settlement of the disputed charges was based on "broader business case considerations" beyond just Pentagon audits.
But when asked whether the Army's decision reflected on the quality of the audits, Colonel Maka said only that the agency "has no indication of problems with the audit process," and he referred questions on the settlement itself to the Army.
A former senior Defense Department manager knowledgeable about the audits and the related contracting issues said, "That's as close as D.C.A.A. can get to saying, 'We're not happy with it either.' "
Because of the size of the contract and the contention surrounding Halliburton's dealings with the government, the RIO audits were carried out by the agency's top personnel and were subjected to extraordinarily thorough reviews, the former manager said.
This is unlikely to be the last time the Army and Halliburton meet over negotiated costs. On a separate contract in Iraq, for logistics support to the United States military, more than $11 billion had been disbursed to Kellogg Brown & Root by mid-January, according to the Army Field Support Command, based in Rock Island, Ill. Pentagon auditors have begun scrutinizing that contract as well.
Sunday, February 26
A Mind Is a Terrible Thing to Measure
By ADAM PHILLIPS
London
PSYCHOTHERAPY is having yet another identity crisis. It has manifested itself in two recent trends in the profession in America: the first involves trying to make therapy into more of a "hard science" by putting a new emphasis on measurable factors; the other is a growing belief among therapists that the standard practice of using talk therapy to discover traumas in a patient's past is not only unnecessary but can be injurious.
That psychotherapists of various orientations find themselves under pressure to prove to themselves and to society that they are doing a hard-core science — which was a leading theme of the landmark Evolution of Psychotherapy Conference in California in December — is not really surprising. Given the prestige and trust the modern world gives to scientific standards, psychotherapists, who always have to measure themselves against the medical profession, are going to want to demonstrate that they, too, deal in the predictable; that they, too, can provide evidence for the value of what they do.
And, obviously, if psychotherapy is going to attain scientific credibility, it won't do to involve such wishy-washy practices as "going back to childhood" or "reconstructing the past" — terms that when used with appropriate scorn can sound as though a person's past was akin to the past lives New Agers like to talk about.
Since at least the middle of the 19th century, Western societies have been divided between religious truth and scientific truth, but none of the new psychotherapies are trying to prove they are genuine religions. Nor is there much talk, outside of university literature departments, of psychotherapy trying to inhabit the middle ground of arts, in which truth and usefulness have traditionally been allowed a certain latitude (nobody measures Shakespeare or tries to prove his value).
It is, so to speak, symptomatic that psychotherapists are so keen to legitimize themselves as scientists: they want to fit in rather than create the taste by which they might be judged. One of the good things psychotherapy can do, like the arts, is show us the limits of what science can do for our welfare. The scientific method alone is never going to be enough, especially when we are working out how to live and who we can be.
In the so-called arts it has always been acknowledged that many of the things we value most — the gods and God, love and sexuality, mourning and amusement, character and inspiration, the past and the future — are neither measurable nor predictable. Indeed, this may be one of the reasons they are so abidingly important to us. The things we value most, just like the things we most fear, tend to be those we have least control over.
This is not a reason to stop trying to control things — we should, for example, be doing everything we can to control pain — but it is a reason to work out in which areas of our lives control is both possible and beneficial. Trying to predict the unpredictable, like trying to will what cannot be willed, drives people crazy.
Just as we cannot know beforehand the effect on us of reading a book or of listening to music, every psychotherapy treatment, indeed every session, is unpredictable. Indeed, if it is not, it is a form of bullying; it is indoctrination. It is not news that most symptoms of so-called mental illness are efforts to control the environment, just like the science that claims to study them.
It would clearly be naïve for psychotherapists to turn a blind eye to science, or to be "against" scientific methodology. But the attempt to present psychotherapy as a hard science is merely an attempt to make it a convincing competitor in the marketplace. It is a sign, in other words, of a misguided wish to make psychotherapy both respectable and servile to the very consumerism it is supposed to help people deal with. (Psychotherapy turns up historically at the point at which traditional societies begin to break down and consumer capitalism begins to take hold.)
If psychotherapy has anything to offer — and this should always be in question — it should be something aside from the dominant trends in the culture. And this means now that its practitioners should not be committed either to making money or to trivializing the past or to finding a science of the soul.
If you have an eye test, if you buy a car, there are certain things you are entitled to expect. Your money buys you some minimal guarantees, some reliable results. The honest psychotherapist can provide no comparable assurances. She can promise only an informed willingness to listen, and the possibility of helpful comment.
By inviting the patient to talk, at length — and especially to talk about what really troubles him — something is opened up, but neither patient nor therapist can know beforehand what will be said by either of them, nor can they know the consequences of what they will say. Just creating a situation that has the potential to evoke previously repressed memories and thoughts and feelings and desires is an opportunity of immeasurable consequence, both good and bad. No amount of training and research, of statistics-gathering and empathy, can offset that unique uncertainty of the encounter.
As a treatment, psychotherapy is a risk, just as what actually happens in anyone's childhood is always going to be obscure and indefinite, but no less significant for being so. Psychotherapists are people whose experience tells them that certain risks are often worth taking, but more than this they cannot rightly say. There are always going to be casualties of therapy.
Psychotherapy makes use of a traditional wisdom holding that the past matters and that, surprisingly, talking can make people feel better — even if at first, for good reasons, they resist it. There is an appetite to talk and to be listened to, and an appetite to make time for doing those things.
Religion has historically been the language for people to talk about the things that mattered most to them, aided and abetted by the arts. Science has become the language that has helped people to know what they wanted to know, and get what they wanted to get. Psychotherapy has to occupy the difficult middle ground between them, but without taking sides. Since it is narrow-mindedness that we most often suffer from, we need our therapists to resist the allure of the fashionable certainties.
Adam Phillips is a psychoanalyst and the author, most recently, of "Going Sane: Maps of Happiness."
Saturday, February 25
What can Culture buy us?
Dany Louise, Art Monthly
During a panel discussion at the 'Art 05' event held in Liverpool in February this year, conversation inevitably turned to Liverpool's designation as Capital of Culture 2008. Chairing the discussion, Alan Yentob wanted to know, 'Who's running the show? Are there any representatives of Capital of Culture here?' A few seconds of embarrassing silence followed...
It was left to Peter Mearns, North West Development Agency director of marketing, to fill the vacuum. 'The Capital of Culture year', he said, 'is the fuel that will drive the rocket that will lead the regeneration of Merseyside. We are interested in jobs and people's quality of life. It's not about the art nor culture at all.' Another deafening silence engulfed the auditorium, but of course - for reasons discussed by JJ Charlesworth (see AM243) - no one was prepared to argue with this statement. Panellist Eddie Berg, ex-director of FACT, could only thank Mearns for his 'frank position statement' and sum up with the deeply cynical: 'here in Liverpool, it is about what culture can buy us'.
So why is this significant? So far, so very New Labour: art as an instrument of and vehicle for social change, with well-funded quangos using their financial leverage to effect government policy. But while commentators on Merseyside have long suspected that the various agencies, including The Culture Company, have little genuine interest in the arts and the city's arts institutions, it is, to the best of my knowledge, the first time it has been admitted as policy in a public forum. It is a policy fully endorsed by Councillor Mike Storey, Leader of Liverpool City Council, who recently said: 'Being Capital of Culture is about many things, but fundamentally it is about creating jobs and creating confidence.'
Everyone on Merseyside wants 2008 to be a success, but it is a cause of tension in the arts community that the agenda for the year is so overtly concerned with regeneration. While this is expected of the NWDA - since the Regional Development Agencies were set up with an explicit responsibility for the economic development of the regions - it is more surprising that The Culture Company positions itself so firmly in the regeneration camp. After all, the bid was won at least partly on the strength of the city's cultural institutions.
Or perhaps it is not surprising at all. Constructed to be firmly under city council control, with the chief executive of the council as director, senior council officers understand the concepts and deliverables of regeneration in the same way that key players in the arts know their own profession. In keeping with the New Labour zeitgeist, and with a passing familiarity with the sector, they think they can pull it off. But is it hiring the right people to do this? The Liverpool 2008 website (www.liverpool08.com) shows that The Culture Company consists of 59 people. By my calculations, a mere eight are in arts jobs. Arguably the most important job of all, that of artistic director, will not be filled on a full-time basis until 2006.
Economic investment and access to cultural activity should enhance and possibly raise the quality of life of Liverpudlians. No one would argue that these are not critically important obligations. However, at present, arts organisations are expected to harness themselves to this aim, with minimal two-way dialogue about the nature of how they, the arts, and the creation of new work can best succeed. The wealth of experience within the arts institutions is not being tapped; questions go unanswered. For example, Lewis Biggs, director of the Liverpool Biennial, points out that 'the Capital of Culture business plan shows large quantities of cash going to secure the arts infrastructure, but no explanation of how much of this is new money as opposed to taking on all the responsibilities of the former City Council arts funding'. Project-based funding is tied to 'themed years', last year around 'faith', in 2005 'the sea'. Projects must adhere to these themes to receive funding. Naturally, every artist, arts and/or community organisation is complying. How can they not?
But it is instructive to compare the Cork Capital of Culture 2005 website (www.cork2005.ie) with the Liverpool 2008 one. Cork talks with integrity and intelligence about its artistic aims, explaining the thinking behind its programming: 'Cork seizes the opportunities offered by the designation European Capital of Culture 2005 in a spirit of engagement with contested ideas. Through our year-wide action in such cultural areas as political discourse, dance, literature, music, theatre, migration and community, we cheerfully join in the debate about what Europe is, and what Europe might become ... We engage in argument as well as art: we thrive on politics as well as music.' Contested ideas? Political discourse? If only The Culture Company would engage at these levels, the city would find it stimulating, pleasurable, satisfying - and, you never know, an exciting creative narrative might emerge from the process. For its part, the fragmented Merseyside arts community could organise itself into an organic unit for the purposes of speaking and negotiating on big issues with an authoritative voice.
I believe the Liverpool situation reflects the national climate. We are witnessing the institutionalisation of art at regional and national levels under New Labour. It is sanctioned, safe and comfortable. Art's function to question received notions, to transgress, to provide a locus for dissent, to elucidate uncomfortable truths and to hold power to account is being eroded. Whose works would an ultra-conservative government burn? What books? It is an age-old political truism that you tame your enemies by giving them a seat at the table. You confer status, influence and financial reward, then sit back and watch them first enjoy, and then defend the status quo.
The bureaucratic collapse of sector boundaries is bringing with it a collapse in understanding of key concepts: so 'art' becomes 'culture' becomes 'museums and libraries' becomes an insignificant part of the 'creative industries' which in turn includes media and ends with sport: lo and behold, the creation of the Department for Culture, Media and Sport. Art has always been commodified, but this state monopoly progressively diminishes art's authentic and independent voice, leaving it in danger of becoming meaningless on its own terms. Paradoxically, we find ourselves in the position where there is more funding available than ever before, but large sections of the visual arts may still be marginalised.
Friday, February 24
To: Professor@University.edu Subject: Why It's All About Me
By JONATHAN D. GLATER
One student skipped class and then sent the professor an e-mail message asking for copies of her teaching notes. Another did not like her grade, and wrote a petulant message to the professor. Another explained that she was late for a Monday class because she was recovering from drinking too much at a wild weekend party.
Jennifer Schultens, an associate professor of mathematics at the University of California, Davis, received this e-mail message last September from a student in her calculus course: "Should I buy a binder or a subject notebook? Since I'm a freshman, I'm not sure how to shop for school supplies. Would you let me know your recommendations? Thank you!"
At colleges and universities nationwide, e-mail has made professors much more approachable. But many say it has made them too accessible, erasing boundaries that traditionally kept students at a healthy distance.
These days, they say, students seem to view them as available around the clock, sending a steady stream of e-mail messages — from 10 a week to 10 after every class — that are too informal or downright inappropriate.
"The tone that they would take in e-mail was pretty astounding," said Michael J. Kessler, an assistant dean and a lecturer in theology at Georgetown University. " 'I need to know this and you need to tell me right now,' with a familiarity that can sometimes border on imperative."
He added: "It's a real fine balance to accommodate what they need and at the same time maintain a level of legitimacy as an instructor and someone who is institutionally authorized to make demands on them, and not the other way round."
While once professors may have expected deference, their expertise seems to have become just another service that students, as consumers, are buying. So students may have no fear of giving offense, imposing on the professor's time or even of asking a question that may reflect badly on their own judgment.
For junior faculty members, the barrage of e-mail has brought new tension into their work lives, some say, as they struggle with how to respond. Their tenure prospects, they realize, may rest in part on student evaluations of their accessibility.
The stakes are different for professors today than they were even a decade ago, said Patricia Ewick, chairwoman of the sociology department at Clark University in Massachusetts, explaining that "students are constantly asked to fill out evaluations of individual faculty." Students also frequently post their own evaluations on Web sites like ratemyprofessors.com and describe their impressions of their professors on blogs.
Last fall, undergraduate students at Syracuse University set up a group in Facebook.com, an online network for students, and dedicated it to maligning one particular instructor. The students were reprimanded.
Professor Ewick said 10 students in one class e-mailed her drafts of their papers days before they were due, seeking comments. "It's all different levels of presumption," she said. "One is that I'll be able to drop everything and read 250 pages two days before I'm going to get 50 of these."
Kathleen E. Jenkins, a sociology professor at the College of William and Mary in Virginia, said she had even received e-mail requests from students who missed class and wanted copies of her teaching notes.
Alexandra Lahav, an associate professor of law at the University of Connecticut, said she felt pressured by the e-mail messages. "I feel sort of responsible, as if I ought to be on call all the time," she said.
Many professors said they were often uncertain how to react. Professor Schultens, who was asked about buying the notebook, said she debated whether to tell the student that this was not a query that should be directed to her, but worried that "such a message could be pretty scary."
"I decided not to respond at all," she said.
Christopher J. Dede, a professor at the Harvard Graduate School of Education who has studied technology in education, said these e-mail messages showed how students no longer deferred to their professors, perhaps because they realized that professors' expertise could rapidly become outdated.
"The deference was probably driven more by the notion that professors were infallible sources of deep knowledge," Professor Dede said, and that notion has weakened.
Meanwhile, students seem unaware that what they write in e-mail could adversely affect them, Professor Lahav said. She recalled an e-mail message from a student saying that he planned to miss class so he could play with his son. Professor Lahav did not respond.
"It's graduate school, he's an adult human being, he's obviously a parent, and it's not my place to tell him how to run his life," she said.
But such e-mail messages can have consequences, she added. "Students don't understand that what they say in e-mail can make them seem very unprofessional, and could result in a bad recommendation."
Still, every professor interviewed emphasized that instant feedback could be invaluable. A question about a lecture or discussion "is for me an indication of a blind spot, that the student didn't get it," said Austin D. Sarat, a professor of political science at Amherst College.
College students say that e-mail makes it easier to ask questions and helps them to learn. "If the only way I could communicate with my professors was by going to their office or calling them, there would be some sort of ranking or prioritization taking place," said Cory Merrill, 19, a sophomore at Amherst. "Is this question worth going over to the office?"
But student e-mail can go too far, said Robert B. Ahdieh, an associate professor at Emory Law School in Atlanta. He paraphrased some of the comments he had received: "I think you're covering the material too fast, or I don't think we're using the reading as much as we could in class, or I think it would be helpful if you would summarize what we've covered at the end of class in case we missed anything."
Students also use e-mail to criticize one another, Professor Ahdieh said. He paraphrased this comment: "You're spending too much time with my moron classmates and you ought to be focusing on those of us who are getting the material."
Michael Greenstone, an economics professor at the Massachusetts Institute of Technology, said he once received an e-mail message late one evening from a student who had recently come to the realization that he was gay and was struggling to cope.
Professor Greenstone said he eventually helped the student get an appointment with a counselor. "I don't think we would have had the opportunity to discuss his realization and accompanying feelings without e-mail as an icebreaker," he said.
A few professors said they had rules for e-mail and told their students how quickly they would respond, how messages should be drafted and what types of messages they would answer.
Meg Worley, an assistant professor of English at Pomona College in California, said she told students that they must say thank you after receiving a professor's response to an e-mail message.
"One of the rules that I teach my students is, the less powerful person always has to write back," Professor Worley said.
Why Doctors So Often Get It Wrong
By DAVID LEONHARDT
ON a weekend day a few years ago, the parents of a 4-year-old boy from rural Georgia brought him to a children's hospital here in north Atlanta. The family had already been through a lot. Their son had been sick for months, with fevers that just would not go away.
The doctors on weekend duty ordered blood tests, which showed that the boy had leukemia. There were a few things about his condition that didn't add up, like the light brown spots on the skin, but the doctors still scheduled a strong course of chemotherapy to start on Monday afternoon. Time, after all, was their enemy.
John Bergsagel, a soft-spoken senior oncologist, remembers arriving at the hospital on Monday morning and having a pile of other cases to get through. He was also bothered by the skin spots, but he agreed that the blood test was clear enough. The boy had leukemia.
"Once you start down one of these clinical pathways," Dr. Bergsagel said, "it's very hard to step off."
What the doctors didn't know was that the boy had a rare form of the disease that chemotherapy does not cure. It makes the symptoms go away for a month or so, but then they return. Worst of all, each round of chemotherapy would bring a serious risk of death, since he was already so weak.
With all the tools available to modern medicine — the blood tests and M.R.I.'s and endoscopes — you might think that misdiagnosis has become a rare thing. But you would be wrong. Studies of autopsies have shown that doctors seriously misdiagnose fatal illnesses about 20 percent of the time. So millions of patients are being treated for the wrong disease.
As shocking as that is, the more astonishing fact may be that the rate has not really changed since the 1930's. "No improvement!" was how an article in the normally exclamation-free Journal of the American Medical Association summarized the situation.
This is the richest country in the world — one where one-seventh of the economy is devoted to health care — and yet misdiagnosis is killing thousands of Americans every year.
How can this be happening? And how is it not a source of national outrage?
A BIG part of the answer is that all of the other medical progress we have made has distracted us from the misdiagnosis crisis.
Any number of diseases that were death sentences just 50 years ago — like childhood leukemia — are often manageable today, thanks to good work done by people like Dr. Bergsagel. The brightly painted pediatric clinic where he practices is a pretty inspiring place on most days, because it's just a detour on the way toward a long, healthy life for four out of five leukemia patients who come here.
But we still could be doing a lot better. Under the current medical system, doctors, nurses, lab technicians and hospital executives are not actually paid to come up with the right diagnosis. They are paid to perform tests and to do surgery and to dispense drugs.
There is no bonus for curing someone and no penalty for failing, except when the mistakes rise to the level of malpractice. So even though doctors can have the best intentions, they have little economic incentive to spend time double-checking their instincts, and hospitals have little incentive to give them the tools to do so.
"You get what you pay for," Mark B. McClellan, who runs Medicare and Medicaid, told me. "And we ought to be paying for better quality."
There are some bits of good news here. Dr. McClellan has set up small pay-for-performance programs in Medicare, and a few insurers are also experimenting. But it isn't nearly a big enough push. We just are not using the power of incentives to save lives. For a politician looking to make the often-bloodless debate over health care come alive, this is a huge opportunity.
Joseph Britto, a former intensive-care doctor, likes to compare medicine's attitude toward mistakes with the airline industry's. At the insistence of pilots, who have the ultimate incentive not to mess up, airlines have studied their errors and nearly eliminated crashes.
"Unlike pilots," Dr. Britto said, "doctors don't go down with their planes."
Dr. Britto was working at a London hospital in 1999 when doctors diagnosed chicken pox in a little girl named Isabel Maude. Only when her organs began shutting down did her doctors realize that she had a potentially fatal flesh-eating virus. Isabel's father, Jason, was so shaken by the experience that he quit his finance job and founded a company — named after his daughter, who is a healthy 10-year-old today — to fight misdiagnosis.
The company sells software that allows doctors to type in a patient's symptoms and, in response, spits out a list of possible causes. It does not replace doctors, but makes sure they can consider some unobvious possibilities that they may not have seen since medical school. Dr. Britto is a top executive.
Not long after the founding of Isabel Healthcare, Dr. Bergsagel in Atlanta stumbled across an article about it and asked to be one of the beta testers. So on that Monday morning, when he couldn't get the inconsistencies in the boy's case out of his mind, he sat down at a computer in a little white room, behind a nurse's station, and entered the symptoms.
Near the top of Isabel's list was a rare form of leukemia that Dr. Bergsagel had never seen before — and that often causes brown skin spots. "It was very much a Eureka moment," he said.
There is no happy ending to the story, because this leukemia has much longer odds than more common kinds. But the boy was spared the misery of pointless chemotherapy and was instead given the only chance he had, a bone marrow transplant. He lived another year and a half.
Today, Dr. Bergsagel uses Isabel a few times a month. The company continues to give him free access. But his colleagues at Children's Healthcare of Atlanta can't use it. The hospital has not bought the service, which costs $80,000 a year for a typical hospital (and $750 for an individual doctor).
Clearly, misdiagnosis costs far more than that. But in the current health care system, hospitals have no way to recoup money they spend on programs like Isabel.
We patients, on the other hand, foot the bill for all those wasted procedures and pointless drugs. So we keep getting them. Does that make any sense?
the port controversy
Osama, Saddam and the Ports
By PAUL KRUGMAN
The storm of protest over the planned takeover of some U.S. port operations by Dubai Ports World doesn't make sense viewed in isolation. The Bush administration clearly made no serious effort to ensure that the deal didn't endanger national security. But that's nothing new — the administration has spent the past four and a half years refusing to do anything serious about protecting the nation's ports.
So why did this latest case of sloppiness and indifference finally catch the public's attention? Because this time the administration has become a victim of its own campaign of fearmongering and insinuation.
Let's go back to the beginning. At 2:40 p.m. on Sept. 11, 2001, Donald Rumsfeld gave military commanders their marching orders. "Judge whether good enough hit S. H. [Saddam Hussein] @ same time — not only UBL [Osama bin Laden]," read an aide's handwritten notes about his instructions. The notes were recently released after a Freedom of Information Act request. "Hard to get a good case," the notes acknowledge. Nonetheless, they say: "Sweep it all up. Things related and not."
So it literally began on Day 1. When terrorists attacked the United States, the Bush administration immediately looked for ways it could exploit the atrocity to pursue unrelated goals — especially, but not exclusively, a war with Iraq.
But to exploit the atrocity, President Bush had to do two things. First, he had to create a climate of fear: Al Qaeda, a real but limited threat, metamorphosed into a vast, imaginary axis of evil threatening America. Second, he had to blur the distinctions between nasty people who actually attacked us and nasty people who didn't.
The administration successfully linked Iraq and 9/11 in public perceptions through a campaign of constant insinuation and occasional outright lies. In the process, it also created a state of mind in which all Arabs were lumped together in the camp of evildoers. Osama, Saddam — what's the difference?
Now comes the ports deal. Mr. Bush assures us that "people don't need to worry about security." But after all those declarations that we're engaged in a global war on terrorism, after all the terror alerts declared whenever the national political debate seemed to be shifting to questions of cronyism, corruption and incompetence, the administration can't suddenly change its theme song to "Don't Worry, Be Happy."
The administration also tells us not to worry about having Arabs control port operations. "I want those who are questioning it," Mr. Bush said, "to step up and explain why all of a sudden a Middle Eastern company is held to a different standard than a Great British company."
He was being evasive, of course. This isn't just a Middle Eastern company; it's a company controlled by the monarchy in Dubai, which is part of the authoritarian United Arab Emirates, one of only three countries that recognized the Taliban as the legitimate ruler of Afghanistan.
But more to the point, after years of systematically suggesting that Arabs who didn't attack us are the same as Arabs who did, the administration can't suddenly turn around and say, "But these are good Arabs."
Finally, the ports affair plays in a subliminal way into the public's awareness — vague but widespread — that Mr. Bush, the self-proclaimed deliverer of democracy to the Middle East, and his family have close personal and financial ties to Middle Eastern rulers. Mr. Bush was photographed holding hands with Crown Prince Abdullah of Saudi Arabia (now King Abdullah), not the emir of Dubai. But an administration that has spent years ridiculing people who try to make such distinctions isn't going to have an easy time explaining the difference.
Mr. Bush shouldn't really be losing his credibility as a terrorism fighter over the ports deal, which, after careful examination (which hasn't happened yet), may turn out to be O.K. Instead, Mr. Bush should have lost his credibility long ago over his diversion of U.S. resources away from the pursuit of Al Qaeda and into an unnecessary war in Iraq, his bungling of that war, and his adoption of a wrongful imprisonment and torture policy that has blackened America's reputation.
But there is, nonetheless, a kind of rough justice in Mr. Bush's current predicament. After 9/11, the American people granted him a degree of trust rarely, if ever, bestowed on our leaders. He abused that trust, and now he is facing a storm of skepticism about his actions — a storm that sweeps up everything, things related and not.
Thursday, February 23
Harvard President Resigns Rather Than Being Fired
Despite the fact that he had established his intellectual reputation at Harvard, loved the place, and was as devoted as anyone there to the life of the mind, Summers nevertheless managed to persuade much of his constituency that he was an alien in their midst. And this had less to do with his views, or his position in the kulturkampf, than his manner, which was almost comically maladroit. One of Summers' favorite phrases was, "Here's what you're thinking." This would typically be followed by a bravura summation of what his interlocutor was, in fact, thinking. (Harvard professors harbor the vanity that they know very well what they're thinking.) Summers had a gift for arming, rather than disarming, his audience. One of his own aides described for me a famously contentious meeting with Law School faculty at which, he said, "Larry told them he wasn't going to pay any attention to their views, when in fact he was going to be listening to their views." Summers so offended his own preferred candidate to head the Graduate School of Education, whom he subjected to a withering cross-examination, that she changed her mind about taking the position until members of the school interceded.
Mr. Summers did not admit to making any mistakes, but he seemed to acknowledge missteps in his leadership. "I have sought for the last five years to prod and challenge the university to reach for the most ambitious goals in creative ways," he said. "There surely have been times when I could have done this in wiser or more respectful ways."
Sunday, February 19
Francis Fukuyama - "After Neoconservatism" from his new book, "America at the Crossroads"
After Neoconservatism
By FRANCIS FUKUYAMA
Francis Fukuyama teaches at the School of Advanced International Studies at Johns Hopkins University.
As we approach the third anniversary of the onset of the Iraq war, it seems very unlikely that history will judge either the intervention itself or the ideas animating it kindly. By invading Iraq, the Bush administration created a self-fulfilling prophecy: Iraq has now replaced Afghanistan as a magnet, a training ground and an operational base for jihadist terrorists, with plenty of American targets to shoot at. The United States still has a chance of creating a Shiite-dominated democratic Iraq, but the new government will be very weak for years to come; the resulting power vacuum will invite outside influence from all of Iraq's neighbors, including Iran. There are clear benefits to the Iraqi people from the removal of Saddam Hussein's dictatorship, and perhaps some positive spillover effects in Lebanon and Syria. But it is very hard to see how these developments in themselves justify the blood and treasure that the United States has spent on the project to this point.
The so-called Bush Doctrine that set the framework for the administration's first term is now in shambles. The doctrine (elaborated, among other places, in the 2002 National Security Strategy of the United States) argued that, in the wake of the Sept. 11 attacks, America would have to launch periodic preventive wars to defend itself against rogue states and terrorists with weapons of mass destruction; that it would do this alone, if necessary; and that it would work to democratize the greater Middle East as a long-term solution to the terrorist problem. But successful pre-emption depends on the ability to predict the future accurately and on good intelligence, which was not forthcoming, while America's perceived unilateralism has isolated it as never before. It is not surprising that in its second term, the administration has been distancing itself from these policies and is in the process of rewriting the National Security Strategy document.
But it is the idealistic effort to use American power to promote democracy and human rights abroad that may suffer the greatest setback. Perceived failure in Iraq has restored the authority of foreign policy "realists" in the tradition of Henry Kissinger. Already there is a host of books and articles decrying America's naïve Wilsonianism and attacking the notion of trying to democratize the world. The administration's second-term efforts to push for greater Middle Eastern democracy, introduced with the soaring rhetoric of Bush's second Inaugural Address, have borne very problematic fruits. The Islamist Muslim Brotherhood made a strong showing in Egypt's parliamentary elections in November and December. While the holding of elections in Iraq this past December was an achievement in itself, the vote led to the ascendance of a Shiite bloc with close ties to Iran (following on the election of the conservative Mahmoud Ahmadinejad as president of Iran in June). But the clincher was the decisive Hamas victory in the Palestinian election last month, which brought to power a movement overtly dedicated to the destruction of Israel. In his second inaugural, Bush said that "America's vital interests and our deepest beliefs are now one," but the charge will be made with increasing frequency that the Bush administration made a big mistake when it stirred the pot, and that the United States would have done better to stick by its traditional authoritarian friends in the Middle East. Indeed, the effort to promote democracy around the world has been attacked as an illegitimate activity both by people on the left like Jeffrey Sachs and by traditional conservatives like Pat Buchanan.
The reaction against democracy promotion and an activist foreign policy may not end there. Those whom Walter Russell Mead labels Jacksonian conservatives — red-state Americans whose sons and daughters are fighting and dying in the Middle East — supported the Iraq war because they believed that their children were fighting to defend the United States against nuclear terrorism, not to promote democracy. They don't want to abandon the president in the middle of a vicious war, but down the road the perceived failure of the Iraq intervention may push them to favor a more isolationist foreign policy, which is a more natural political position for them. A recent Pew poll indicates a swing in public opinion toward isolationism; the percentage of Americans saying that the United States "should mind its own business" has never been higher since the end of the Vietnam War.
More than any other group, it was the neoconservatives both inside and outside the Bush administration who pushed for democratizing Iraq and the broader Middle East. They are widely credited (or blamed) for being the decisive voices promoting regime change in Iraq, and yet it is their idealistic agenda that in the coming months and years will be the most directly threatened. Were the United States to retreat from the world stage, following a drawdown in Iraq, it would in my view be a huge tragedy, because American power and influence have been critical to the maintenance of an open and increasingly democratic order around the world. The problem with neoconservatism's agenda lies not in its ends, which are as American as apple pie, but rather in the overmilitarized means by which it has sought to accomplish them. What American foreign policy needs is not a return to a narrow and cynical realism, but rather the formulation of a "realistic Wilsonianism" that better matches means to ends.
The Neoconservative Legacy
How did the neoconservatives end up overreaching to such an extent that they risk undermining their own goals? The Bush administration's first-term foreign policy did not flow ineluctably from the views of earlier generations of people who considered themselves neoconservatives, since those views were themselves complex and subject to differing interpretations. Four common principles or threads ran through much of this thought up through the end of the cold war: a concern with democracy, human rights and, more generally, the internal politics of states; a belief that American power can be used for moral purposes; a skepticism about the ability of international law and institutions to solve serious security problems; and finally, a view that ambitious social engineering often leads to unexpected consequences and thereby undermines its own ends.
The problem was that two of these principles were in potential collision. The skeptical stance toward ambitious social engineering — which in earlier years had been applied mostly to domestic policies like affirmative action, busing and welfare — suggested a cautious approach toward remaking the world and an awareness that ambitious initiatives always have unanticipated consequences. The belief in the potential moral uses of American power, on the other hand, implied that American activism could reshape the structure of global politics. By the time of the Iraq war, the belief in the transformational uses of power had prevailed over the doubts about social engineering.
In retrospect, things did not have to develop this way. The roots of neoconservatism lie in a remarkable group of largely Jewish intellectuals who attended City College of New York (C.C.N.Y.) in the mid- to late 1930's and early 1940's, a group that included Irving Kristol, Daniel Bell, Irving Howe, Nathan Glazer and, a bit later, Daniel Patrick Moynihan. The story of this group has been told in a number of places, most notably in a documentary film by Joseph Dorman called "Arguing the World." The most important inheritance from the C.C.N.Y. group was an idealistic belief in social progress and the universality of rights, coupled with intense anti-Communism.
It is not an accident that many in the C.C.N.Y. group started out as Trotskyites. Leon Trotsky was, of course, himself a Communist, but his supporters came to understand better than most people the utter cynicism and brutality of the Stalinist regime. The anti-Communist left, in contrast to the traditional American right, sympathized with the social and economic aims of Communism, but in the course of the 1930's and 1940's came to realize that "real existing socialism" had become a monstrosity of unintended consequences that completely undermined the idealistic goals it espoused. While not all of the C.C.N.Y. thinkers became neoconservatives, the danger of good intentions carried to extremes was a theme that would underlie the life work of many members of this group.
If there was a single overarching theme to the domestic social policy critiques issued by those who wrote for the neoconservative journal The Public Interest, founded by Irving Kristol, Nathan Glazer and Daniel Bell in 1965, it was the limits of social engineering. Writers like Glazer, Moynihan and, later, Glenn Loury argued that ambitious efforts to seek social justice often left societies worse off than before because they either required massive state intervention that disrupted pre-existing social relations (for example, forced busing) or else produced unanticipated consequences (like an increase in single-parent families as a result of welfare). A major theme running through James Q. Wilson's extensive writings on crime was the idea that you could not lower crime rates by trying to solve deep underlying problems like poverty and racism; effective policies needed to focus on shorter-term measures that went after symptoms of social distress (like subway graffiti or panhandling) rather than root causes.
How, then, did a group with such a pedigree come to decide that the "root cause" of terrorism lay in the Middle East's lack of democracy, that the United States had both the wisdom and the ability to fix this problem and that democracy would come quickly and painlessly to Iraq? Neoconservatives would not have taken this turn but for the peculiar way that the cold war ended.
Ronald Reagan was ridiculed by sophisticated people on the American left and in Europe for labeling the Soviet Union and its allies an "evil empire" and for challenging Mikhail Gorbachev not just to reform his system but also to "tear down this wall." His assistant secretary of defense for international security policy, Richard Perle, was denounced as the "prince of darkness" for this uncompromising, hard-line position; his proposal for a double-zero in the intermediate-range nuclear arms negotiations (that is, the complete elimination of medium-range missiles) was attacked as hopelessly out of touch by the bien-pensant centrist foreign-policy experts at places like the Council on Foreign Relations and the State Department. That community felt that the Reaganites were dangerously utopian in their hopes for actually winning, as opposed to managing, the cold war.
And yet total victory in the cold war is exactly what happened in 1989-91. Gorbachev accepted not only the double zero but also deep cuts in conventional forces, and then failed to stop the Polish, Hungarian and East German defections from the empire. Communism collapsed within a couple of years because of its internal moral weaknesses and contradictions, and with regime change in Eastern Europe and the former Soviet Union, the Warsaw Pact threat to the West evaporated.
The way the cold war ended shaped the thinking of supporters of the Iraq war, including younger neoconservatives like William Kristol and Robert Kagan, in two ways. First, it seems to have created an expectation that all totalitarian regimes were hollow at the core and would crumble with a small push from outside. The model for this was Romania under the Ceausescus: once the wicked witch was dead, the munchkins would rise up and start singing joyously about their liberation. As Kristol and Kagan put it in their 2000 book "Present Dangers": "To many the idea of America using its power to promote changes of regime in nations ruled by dictators rings of utopianism. But in fact, it is eminently realistic. There is something perverse in declaring the impossibility of promoting democratic change abroad in light of the record of the past three decades."
This overoptimism about postwar transitions to democracy helps explain the Bush administration's incomprehensible failure to plan adequately for the insurgency that subsequently emerged in Iraq. The war's supporters seemed to think that democracy was a kind of default condition to which societies reverted once the heavy lifting of coercive regime change occurred, rather than a long-term process of institution-building and reform. While they now assert that they knew all along that the democratic transformation of Iraq would be long and hard, they were clearly taken by surprise. According to George Packer's recent book on Iraq, "The Assassins' Gate," the Pentagon planned a drawdown of American forces to some 25,000 troops by the end of the summer following the invasion.
By the 1990's, neoconservatism had been fed by several other intellectual streams. One came from the students of the German Jewish political theorist Leo Strauss, who, contrary to much of the nonsense written about him by people like Anne Norton and Shadia Drury, was a serious reader of philosophical texts who did not express opinions on contemporary politics or policy issues. Rather, he was concerned with the "crisis of modernity" brought on by the relativism of Nietzsche and Heidegger, as well as the fact that neither the claims of religion nor deeply-held opinions about the nature of the good life could be banished from politics, as the thinkers of the European Enlightenment had hoped. Another stream came from Albert Wohlstetter, a Rand Corporation strategist who was the teacher of Richard Perle, Zalmay Khalilzad (the current American ambassador to Iraq) and Paul Wolfowitz (the former deputy secretary of defense), among other people. Wohlstetter was intensely concerned with the problem of nuclear proliferation and the way that the 1968 Nonproliferation Treaty left loopholes, in its support for "peaceful" nuclear energy, large enough for countries like Iraq and Iran to walk through.
I have numerous affiliations with the different strands of the neoconservative movement. I was a student of Strauss's protégé Allan Bloom, who wrote the bestseller "The Closing of the American Mind"; worked at Rand and with Wohlstetter on Persian Gulf issues; and worked also on two occasions for Wolfowitz. Many people have also interpreted my book "The End of History and the Last Man" (1992) as a neoconservative tract, one that argued in favor of the view that there is a universal hunger for liberty in all people that will inevitably lead them to liberal democracy, and that we are living in the midst of an accelerating, transnational movement in favor of that liberal democracy. This is a misreading of the argument. "The End of History" is in the end an argument about modernization. What is initially universal is not the desire for liberal democracy but rather the desire to live in a modern — that is, technologically advanced and prosperous — society, which, if satisfied, tends to drive demands for political participation. Liberal democracy is one of the byproducts of this modernization process, something that becomes a universal aspiration only in the course of historical time.
"The End of History," in other words, presented a kind of Marxist argument for the existence of a long-term process of social evolution, but one that terminates in liberal democracy rather than communism. In the formulation of the scholar Ken Jowitt, the neoconservative position articulated by people like Kristol and Kagan was, by contrast, Leninist; they believed that history can be pushed along with the right application of power and will. Leninism was a tragedy in its Bolshevik version, and it has returned as farce when practiced by the United States. Neoconservatism, as both a political symbol and a body of thought, has evolved into something I can no longer support.
The Failure of Benevolent Hegemony
The Bush administration and its neoconservative supporters did not simply underestimate the difficulty of bringing about congenial political outcomes in places like Iraq; they also misunderstood the way the world would react to the use of American power. Of course, the cold war was replete with instances of what the foreign policy analyst Stephen Sestanovich calls American maximalism, wherein Washington acted first and sought legitimacy and support from its allies only after the fact. But in the post-cold-war period, the structural situation of world politics changed in ways that made this kind of exercise of power much more problematic in the eyes of even close allies. After the fall of the Soviet Union, various neoconservative authors like Charles Krauthammer, William Kristol and Robert Kagan suggested that the United States would use its margin of power to exert a kind of "benevolent hegemony" over the rest of the world, fixing problems like rogue states with W.M.D., human rights abuses and terrorist threats as they came up. Writing before the Iraq war, Kristol and Kagan considered whether this posture would provoke resistance from the rest of the world, and concluded, "It is precisely because American foreign policy is infused with an unusually high degree of morality that other nations find they have less to fear from its otherwise daunting power." (Italics added.)
It is hard to read these lines without irony in the wake of the global reaction to the Iraq war, which succeeded in uniting much of the world in a frenzy of anti-Americanism. The idea that the United States is a hegemon more benevolent than most is not an absurd one, but there were warning signs that things had changed in America's relationship to the world long before the start of the Iraq war. The structural imbalance in global power had grown enormous. America surpassed the rest of the world in every dimension of power by an unprecedented margin, with its defense spending nearly equal to that of the rest of the world combined. Already during the Clinton years, American economic hegemony had generated enormous hostility to an American-dominated process of globalization, frequently on the part of close democratic allies who thought the United States was seeking to impose its antistatist social model on them.
There were other reasons as well why the world did not accept American benevolent hegemony. In the first place, it was premised on American exceptionalism, the idea that America could use its power in instances where others could not because it was more virtuous than other countries. The doctrine of pre-emption against terrorist threats contained in the 2002 National Security Strategy was one that could not safely be generalized through the international system; America would be the first country to object if Russia, China, India or France declared a similar right of unilateral action. The United States was seeking to pass judgment on others while being unwilling to have its own conduct questioned in places like the International Criminal Court.
Another problem with benevolent hegemony was domestic. There are sharp limits to the American people's attention to foreign affairs and willingness to finance projects overseas that do not have clear benefits to American interests. Sept. 11 changed that calculus in many ways, providing popular support for two wars in the Middle East and large increases in defense spending. But the durability of the support is uncertain: although most Americans want to do what is necessary to make the project of rebuilding Iraq succeed, the aftermath of the invasion did not increase the public appetite for further costly interventions. Americans are not, at heart, an imperial people. Even benevolent hegemons sometimes have to act ruthlessly, and they need a staying power that does not come easily to people who are reasonably content with their own lives and society.
Finally, benevolent hegemony presumed that the hegemon was not only well intentioned but competent as well. Much of the criticism of the Iraq intervention from Europeans and others was not based on a normative case that the United States was not getting authorization from the United Nations Security Council, but rather on the belief that it had not made an adequate case for invading Iraq in the first place and didn't know what it was doing in trying to democratize Iraq. In this, the critics were unfortunately quite prescient.
The most basic misjudgment was an overestimation of the threat facing the United States from radical Islamism. Although the new and ominous possibility of undeterrable terrorists armed with weapons of mass destruction did indeed present itself, advocates of the war wrongly conflated this with the threat presented by Iraq and with the rogue state/proliferation problem more generally. The misjudgment was based in part on the massive failure of the American intelligence community to correctly assess the state of Iraq's W.M.D. programs before the war. But the intelligence community never took nearly as alarmist a view of the terrorist/W.M.D. threat as the war's supporters did. Overestimation of this threat was then used to justify the elevation of preventive war to the centerpiece of a new security strategy, as well as a whole series of measures that infringed on civil liberties, from detention policy to domestic eavesdropping.
What to Do
Now that the neoconservative moment appears to have passed, the United States needs to reconceptualize its foreign policy in several fundamental ways. In the first instance, we need to demilitarize what we have been calling the global war on terrorism and shift to other types of policy instruments. We are fighting hot counterinsurgency wars in Afghanistan and Iraq and against the international jihadist movement, wars in which we need to prevail. But "war" is the wrong metaphor for the broader struggle, since wars are fought at full intensity and have clear beginnings and endings. Meeting the jihadist challenge is more of a "long, twilight struggle" whose core is not a military campaign but a political contest for the hearts and minds of ordinary Muslims around the world. As recent events in France and Denmark suggest, Europe will be a central battleground in this fight.
The United States needs to come up with something better than "coalitions of the willing" to legitimate its dealings with other countries. The world today lacks effective international institutions that can confer legitimacy on collective action; creating new organizations that will better balance the dual requirements of legitimacy and effectiveness will be the primary task for the coming generation. As a result of more than 200 years of political evolution, we have a relatively good understanding of how to create institutions that are rulebound, accountable and reasonably effective in the vertical silos we call states. What we do not have are adequate mechanisms of horizontal accountability among states.
The conservative critique of the United Nations is all too cogent: while useful for certain peacekeeping and nation-building operations, the United Nations lacks both democratic legitimacy and effectiveness in dealing with serious security issues. The solution is not to strengthen a single global body, but rather to promote what has been emerging in any event, a "multi-multilateral world" of overlapping and occasionally competing international institutions that are organized on regional or functional lines. Kosovo in 1999 was a model: when the Russian veto prevented the Security Council from acting, the United States and its NATO allies simply shifted the venue to NATO, where the Russians could not block action.
The final area that needs rethinking, and the one that will be the most contested in the coming months and years, is the place of democracy promotion in American foreign policy. The worst legacy that could come from the Iraq war would be an anti-neoconservative backlash that coupled a sharp turn toward isolation with a cynical realist policy aligning the United States with friendly authoritarians. Good governance, which involves not just democracy but also the rule of law and economic development, is critical to a host of outcomes we desire, from alleviating poverty to dealing with pandemics to controlling violent conflicts. A Wilsonian policy that pays attention to how rulers treat their citizens is therefore right, but it needs to be informed by a certain realism that was missing from the thinking of the Bush administration in its first term and of its neoconservative allies.
We need in the first instance to understand that promoting democracy and modernization in the Middle East is not a solution to the problem of jihadist terrorism; in all likelihood it will make the short-term problem worse, as we have seen in the case of the Palestinian election bringing Hamas to power. Radical Islamism is a byproduct of modernization itself, arising from the loss of identity that accompanies the transition to a modern, pluralist society. It is no accident that so many recent terrorists, from Sept. 11's Mohamed Atta to the murderer of the Dutch filmmaker Theo van Gogh to the London subway bombers, were radicalized in democratic Europe and intimately familiar with all of democracy's blessings. More democracy will mean more alienation, radicalization and — yes, unfortunately — terrorism.
But greater political participation by Islamist groups is very likely to occur whatever we do, and it will be the only way that the poison of radical Islamism can ultimately work its way through the body politic of Muslim communities around the world. The age is long since gone when friendly authoritarians could rule over passive populations and produce stability indefinitely. New social actors are mobilizing everywhere, from Bolivia and Venezuela to South Africa and the Persian Gulf. A durable Israeli-Palestinian peace could not be built upon a corrupt, illegitimate Fatah that constantly had to worry about Hamas challenging its authority. Peace might emerge, sometime down the road, from a Palestine run by a formerly radical terrorist group that had been forced to deal with the realities of governing.
If we are serious about the good governance agenda, we have to shift our focus to the reform, reorganization and proper financing of those institutions of the United States government that actually promote democracy, development and the rule of law around the world, organizations like the State Department, U.S.A.I.D., the National Endowment for Democracy and the like. The United States has played an often decisive role in helping along many recent democratic transitions, including in the Philippines in 1986; South Korea and Taiwan in 1987; Chile in 1988; Poland and Hungary in 1989; Serbia in 2000; Georgia in 2003; and Ukraine in 2004-5. But the overarching lesson that emerges from these cases is that the United States does not get to decide when and where democracy comes about. By definition, outsiders can't "impose" democracy on a country that doesn't want it; demand for democracy and reform must be domestic. Democracy promotion is therefore a long-term and opportunistic process that has to await the gradual ripening of political and economic conditions to be effective.
The Bush administration has been walking — indeed, sprinting — away from the legacy of its first term, as evidenced by the cautious multilateral approach it has taken toward the nuclear programs of Iran and North Korea. Condoleezza Rice gave a serious speech in January about "transformational diplomacy" and has begun an effort to reorganize the nonmilitary side of the foreign-policy establishment, and the National Security Strategy document is being rewritten. All of these are welcome changes, but the legacy of the Bush first-term foreign policy and its neoconservative supporters has been so polarizing that it is going to be hard to have a reasoned debate about how to appropriately balance American ideals and interests in the coming years. The reaction against a flawed policy can be as damaging as the policy itself, and such a reaction is an indulgence we cannot afford, given the critical moment we have arrived at in global politics.
Neoconservatism, whatever its complex roots, has become indelibly associated with concepts like coercive regime change, unilateralism and American hegemony. What is needed now are new ideas, neither neoconservative nor realist, for how America is to relate to the rest of the world — ideas that retain the neoconservative belief in the universality of human rights, but without its illusions about the efficacy of American power and hegemony to bring these ends about.
Francis Fukuyama teaches at the School of Advanced International Studies at Johns Hopkins University.
This essay is adapted from his book "America at the Crossroads," which will be published this month by Yale University Press.
By FRANCIS FUKUYAMA
As we approach the third anniversary of the onset of the Iraq war, it seems very unlikely that history will judge either the intervention itself or the ideas animating it kindly. By invading Iraq, the Bush administration created a self-fulfilling prophecy: Iraq has now replaced Afghanistan as a magnet, a training ground and an operational base for jihadist terrorists, with plenty of American targets to shoot at. The United States still has a chance of creating a Shiite-dominated democratic Iraq, but the new government will be very weak for years to come; the resulting power vacuum will invite outside influence from all of Iraq's neighbors, including Iran. There are clear benefits to the Iraqi people from the removal of Saddam Hussein's dictatorship, and perhaps some positive spillover effects in Lebanon and Syria. But it is very hard to see how these developments in themselves justify the blood and treasure that the United States has spent on the project to this point.
The so-called Bush Doctrine that set the framework for the administration's first term is now in shambles. The doctrine (elaborated, among other places, in the 2002 National Security Strategy of the United States) argued that, in the wake of the Sept. 11 attacks, America would have to launch periodic preventive wars to defend itself against rogue states and terrorists with weapons of mass destruction; that it would do this alone, if necessary; and that it would work to democratize the greater Middle East as a long-term solution to the terrorist problem. But successful pre-emption depends on the ability to predict the future accurately and on good intelligence, which was not forthcoming, while America's perceived unilateralism has isolated it as never before. It is not surprising that in its second term, the administration has been distancing itself from these policies and is in the process of rewriting the National Security Strategy document.
But it is the idealistic effort to use American power to promote democracy and human rights abroad that may suffer the greatest setback. Perceived failure in Iraq has restored the authority of foreign policy "realists" in the tradition of Henry Kissinger. Already there is a host of books and articles decrying America's naïve Wilsonianism and attacking the notion of trying to democratize the world. The administration's second-term efforts to push for greater Middle Eastern democracy, introduced with the soaring rhetoric of Bush's second Inaugural Address, have borne very problematic fruits. The Islamist Muslim Brotherhood made a strong showing in Egypt's parliamentary elections in November and December. While the holding of elections in Iraq this past December was an achievement in itself, the vote led to the ascendance of a Shiite bloc with close ties to Iran (following on the election of the conservative Mahmoud Ahmadinejad as president of Iran in June). But the clincher was the decisive Hamas victory in the Palestinian election last month, which brought to power a movement overtly dedicated to the destruction of Israel. In his second inaugural, Bush said that "America's vital interests and our deepest beliefs are now one," but the charge will be made with increasing frequency that the Bush administration made a big mistake when it stirred the pot, and that the United States would have done better to stick by its traditional authoritarian friends in the Middle East. Indeed, the effort to promote democracy around the world has been attacked as an illegitimate activity both by people on the left like Jeffrey Sachs and by traditional conservatives like Pat Buchanan.
The reaction against democracy promotion and an activist foreign policy may not end there. Those whom Walter Russell Mead labels Jacksonian conservatives — red-state Americans whose sons and daughters are fighting and dying in the Middle East — supported the Iraq war because they believed that their children were fighting to defend the United States against nuclear terrorism, not to promote democracy. They don't want to abandon the president in the middle of a vicious war, but down the road the perceived failure of the Iraq intervention may push them to favor a more isolationist foreign policy, which is a more natural political position for them. A recent Pew poll indicates a swing in public opinion toward isolationism; the percentage of Americans saying that the United States "should mind its own business" is at its highest level since the end of the Vietnam War.
More than any other group, it was the neoconservatives both inside and outside the Bush administration who pushed for democratizing Iraq and the broader Middle East. They are widely credited (or blamed) for being the decisive voices promoting regime change in Iraq, and yet it is their idealistic agenda that in the coming months and years will be the most directly threatened. Were the United States to retreat from the world stage, following a drawdown in Iraq, it would in my view be a huge tragedy, because American power and influence have been critical to the maintenance of an open and increasingly democratic order around the world. The problem with neoconservatism's agenda lies not in its ends, which are as American as apple pie, but rather in the overmilitarized means by which it has sought to accomplish them. What American foreign policy needs is not a return to a narrow and cynical realism, but rather the formulation of a "realistic Wilsonianism" that better matches means to ends.
The Neoconservative Legacy
How did the neoconservatives end up overreaching to such an extent that they risk undermining their own goals? The Bush administration's first-term foreign policy did not flow ineluctably from the views of earlier generations of people who considered themselves neoconservatives, since those views were themselves complex and subject to differing interpretations. Four common principles or threads ran through much of this thought up through the end of the cold war: a concern with democracy, human rights and, more generally, the internal politics of states; a belief that American power can be used for moral purposes; a skepticism about the ability of international law and institutions to solve serious security problems; and finally, a view that ambitious social engineering often leads to unexpected consequences and thereby undermines its own ends.
The problem was that two of these principles were in potential collision. The skeptical stance toward ambitious social engineering — which in earlier years had been applied mostly to domestic policies like affirmative action, busing and welfare — suggested a cautious approach toward remaking the world and an awareness that ambitious initiatives always have unanticipated consequences. The belief in the potential moral uses of American power, on the other hand, implied that American activism could reshape the structure of global politics. By the time of the Iraq war, the belief in the transformational uses of power had prevailed over the doubts about social engineering.
In retrospect, things did not have to develop this way. The roots of neoconservatism lie in a remarkable group of largely Jewish intellectuals who attended City College of New York (C.C.N.Y.) in the mid- to late 1930's and early 1940's, a group that included Irving Kristol, Daniel Bell, Irving Howe, Nathan Glazer and, a bit later, Daniel Patrick Moynihan. The story of this group has been told in a number of places, most notably in a documentary film by Joseph Dorman called "Arguing the World." The most important inheritance from the C.C.N.Y. group was an idealistic belief in social progress and the universality of rights, coupled with intense anti-Communism.
It is not an accident that many in the C.C.N.Y. group started out as Trotskyites. Leon Trotsky was, of course, himself a Communist, but his supporters came to understand better than most people the utter cynicism and brutality of the Stalinist regime. The anti-Communist left, in contrast to the traditional American right, sympathized with the social and economic aims of Communism, but in the course of the 1930's and 1940's came to realize that "real existing socialism" had become a monstrosity of unintended consequences that completely undermined the idealistic goals it espoused. While not all of the C.C.N.Y. thinkers became neoconservatives, the danger of good intentions carried to extremes was a theme that would underlie the life work of many members of this group.
If there was a single overarching theme to the domestic social policy critiques issued by those who wrote for the neoconservative journal The Public Interest, founded by Irving Kristol, Nathan Glazer and Daniel Bell in 1965, it was the limits of social engineering. Writers like Glazer, Moynihan and, later, Glenn Loury argued that ambitious efforts to seek social justice often left societies worse off than before because they either required massive state intervention that disrupted pre-existing social relations (for example, forced busing) or else produced unanticipated consequences (like an increase in single-parent families as a result of welfare). A major theme running through James Q. Wilson's extensive writings on crime was the idea that you could not lower crime rates by trying to solve deep underlying problems like poverty and racism; effective policies needed to focus on shorter-term measures that went after symptoms of social distress (like subway graffiti or panhandling) rather than root causes.
How, then, did a group with such a pedigree come to decide that the "root cause" of terrorism lay in the Middle East's lack of democracy, that the United States had both the wisdom and the ability to fix this problem and that democracy would come quickly and painlessly to Iraq? Neoconservatives would not have taken this turn but for the peculiar way that the cold war ended.
Ronald Reagan was ridiculed by sophisticated people on the American left and in Europe for labeling the Soviet Union and its allies an "evil empire" and for challenging Mikhail Gorbachev not just to reform his system but also to "tear down this wall." His assistant secretary of defense for international security policy, Richard Perle, was denounced as the "prince of darkness" for this uncompromising, hard-line position; his proposal for a double-zero in the intermediate-range nuclear arms negotiations (that is, the complete elimination of medium-range missiles) was attacked as hopelessly out of touch by the bien-pensant centrist foreign-policy experts at places like the Council on Foreign Relations and the State Department. That community felt that the Reaganites were dangerously utopian in their hopes for actually winning, as opposed to managing, the cold war.
And yet total victory in the cold war is exactly what happened in 1989-91. Gorbachev accepted not only the double zero but also deep cuts in conventional forces, and then failed to stop the Polish, Hungarian and East German defections from the empire. Communism collapsed within a couple of years because of its internal moral weaknesses and contradictions, and with regime change in Eastern Europe and the former Soviet Union, the Warsaw Pact threat to the West evaporated.
The way the cold war ended shaped the thinking of supporters of the Iraq war, including younger neoconservatives like William Kristol and Robert Kagan, in two ways. First, it seems to have created an expectation that all totalitarian regimes were hollow at the core and would crumble with a small push from outside. The model for this was Romania under the Ceausescus: once the wicked witch was dead, the munchkins would rise up and start singing joyously about their liberation. As Kristol and Kagan put it in their 2000 book "Present Dangers": "To many the idea of America using its power to promote changes of regime in nations ruled by dictators rings of utopianism. But in fact, it is eminently realistic. There is something perverse in declaring the impossibility of promoting democratic change abroad in light of the record of the past three decades."
This overoptimism about postwar transitions to democracy helps explain the Bush administration's incomprehensible failure to plan adequately for the insurgency that subsequently emerged in Iraq. The war's supporters seemed to think that democracy was a kind of default condition to which societies reverted once the heavy lifting of coercive regime change occurred, rather than a long-term process of institution-building and reform. While they now assert that they knew all along that the democratic transformation of Iraq would be long and hard, they were clearly taken by surprise. According to George Packer's recent book on Iraq, "The Assassins' Gate," the Pentagon planned a drawdown of American forces to some 25,000 troops by the end of the summer following the invasion.
By the 1990's, neoconservatism had been fed by several other intellectual streams. One came from the students of the German Jewish political theorist Leo Strauss, who, contrary to much of the nonsense written about him by people like Anne Norton and Shadia Drury, was a serious reader of philosophical texts who did not express opinions on contemporary politics or policy issues. Rather, he was concerned with the "crisis of modernity" brought on by the relativism of Nietzsche and Heidegger, as well as the fact that neither the claims of religion nor deeply-held opinions about the nature of the good life could be banished from politics, as the thinkers of the European Enlightenment had hoped. Another stream came from Albert Wohlstetter, a Rand Corporation strategist who was the teacher of Richard Perle, Zalmay Khalilzad (the current American ambassador to Iraq) and Paul Wolfowitz (the former deputy secretary of defense), among other people. Wohlstetter was intensely concerned with the problem of nuclear proliferation and the way that the 1968 Nonproliferation Treaty left loopholes, in its support for "peaceful" nuclear energy, large enough for countries like Iraq and Iran to walk through.
I have numerous affiliations with the different strands of the neoconservative movement. I was a student of Strauss's protégé Allan Bloom, who wrote the bestseller "The Closing of the American Mind"; worked at Rand and with Wohlstetter on Persian Gulf issues; and worked also on two occasions for Wolfowitz. Many people have also interpreted my book "The End of History and the Last Man" (1992) as a neoconservative tract, one that argued in favor of the view that there is a universal hunger for liberty in all people that will inevitably lead them to liberal democracy, and that we are living in the midst of an accelerating, transnational movement in favor of that liberal democracy. This is a misreading of the argument. "The End of History" is in the end an argument about modernization. What is initially universal is not the desire for liberal democracy but rather the desire to live in a modern — that is, technologically advanced and prosperous — society, which, if satisfied, tends to drive demands for political participation. Liberal democracy is one of the byproducts of this modernization process, something that becomes a universal aspiration only in the course of historical time.
"The End of History," in other words, presented a kind of Marxist argument for the existence of a long-term process of social evolution, but one that terminates in liberal democracy rather than communism. In the formulation of the scholar Ken Jowitt, the neoconservative position articulated by people like Kristol and Kagan was, by contrast, Leninist; they believed that history can be pushed along with the right application of power and will. Leninism was a tragedy in its Bolshevik version, and it has returned as farce when practiced by the United States. Neoconservatism, as both a political symbol and a body of thought, has evolved into something I can no longer support.
The Failure of Benevolent Hegemony
The Bush administration and its neoconservative supporters did not simply underestimate the difficulty of bringing about congenial political outcomes in places like Iraq; they also misunderstood the way the world would react to the use of American power. Of course, the cold war was replete with instances of what the foreign policy analyst Stephen Sestanovich calls American maximalism, wherein Washington acted first and sought legitimacy and support from its allies only after the fact. But in the post-cold-war period, the structural situation of world politics changed in ways that made this kind of exercise of power much more problematic in the eyes of even close allies. After the fall of the Soviet Union, various neoconservative authors like Charles Krauthammer, William Kristol and Robert Kagan suggested that the United States would use its margin of power to exert a kind of "benevolent hegemony" over the rest of the world, fixing problems like rogue states with W.M.D., human rights abuses and terrorist threats as they came up. Writing before the Iraq war, Kristol and Kagan considered whether this posture would provoke resistance from the rest of the world, and concluded, "It is precisely because American foreign policy is infused with an unusually high degree of morality that other nations find they have less to fear from its otherwise daunting power." (Italics added.)
It is hard to read these lines without irony in the wake of the global reaction to the Iraq war, which succeeded in uniting much of the world in a frenzy of anti-Americanism. The idea that the United States is a hegemon more benevolent than most is not an absurd one, but there were warning signs that things had changed in America's relationship to the world long before the start of the Iraq war. The structural imbalance in global power had grown enormous. America surpassed the rest of the world in every dimension of power by an unprecedented margin, with its defense spending nearly equal to that of the rest of the world combined. Already during the Clinton years, American economic hegemony had generated enormous hostility to an American-dominated process of globalization, frequently on the part of close democratic allies who thought the United States was seeking to impose its antistatist social model on them.
There were other reasons as well why the world did not accept American benevolent hegemony. In the first place, it was premised on American exceptionalism, the idea that America could use its power in instances where others could not because it was more virtuous than other countries. The doctrine of pre-emption against terrorist threats contained in the 2002 National Security Strategy was one that could not safely be generalized through the international system; America would be the first country to object if Russia, China, India or France declared a similar right of unilateral action. The United States was seeking to pass judgment on others while being unwilling to have its own conduct questioned in places like the International Criminal Court.
Another problem with benevolent hegemony was domestic. There are sharp limits to the American people's attention to foreign affairs and willingness to finance projects overseas that do not have clear benefits to American interests. Sept. 11 changed that calculus in many ways, providing popular support for two wars in the Middle East and large increases in defense spending. But the durability of the support is uncertain: although most Americans want to do what is necessary to make the project of rebuilding Iraq succeed, the aftermath of the invasion did not increase the public appetite for further costly interventions. Americans are not, at heart, an imperial people. Even benevolent hegemons sometimes have to act ruthlessly, and they need a staying power that does not come easily to people who are reasonably content with their own lives and society.
Finally, benevolent hegemony presumed that the hegemon was not only well intentioned but competent as well. Much of the criticism of the Iraq intervention from Europeans and others was not based on a normative case that the United States was not getting authorization from the United Nations Security Council, but rather on the belief that it had not made an adequate case for invading Iraq in the first place and didn't know what it was doing in trying to democratize Iraq. In this, the critics were unfortunately quite prescient.
The most basic misjudgment was an overestimation of the threat facing the United States from radical Islamism. Although the new and ominous possibility of undeterrable terrorists armed with weapons of mass destruction did indeed present itself, advocates of the war wrongly conflated this with the threat presented by Iraq and with the rogue state/proliferation problem more generally. The misjudgment was based in part on the massive failure of the American intelligence community to correctly assess the state of Iraq's W.M.D. programs before the war. But the intelligence community never took nearly as alarmist a view of the terrorist/W.M.D. threat as the war's supporters did. Overestimation of this threat was then used to justify the elevation of preventive war to the centerpiece of a new security strategy, as well as a whole series of measures that infringed on civil liberties, from detention policy to domestic eavesdropping.
What to Do
Now that the neoconservative moment appears to have passed, the United States needs to reconceptualize its foreign policy in several fundamental ways. In the first instance, we need to demilitarize what we have been calling the global war on terrorism and shift to other types of policy instruments. We are fighting hot counterinsurgency wars in Afghanistan and Iraq and against the international jihadist movement, wars in which we need to prevail. But "war" is the wrong metaphor for the broader struggle, since wars are fought at full intensity and have clear beginnings and endings. Meeting the jihadist challenge is more of a "long, twilight struggle" whose core is not a military campaign but a political contest for the hearts and minds of ordinary Muslims around the world. As recent events in France and Denmark suggest, Europe will be a central battleground in this fight.
The United States needs to come up with something better than "coalitions of the willing" to legitimate its dealings with other countries. The world today lacks effective international institutions that can confer legitimacy on collective action; creating new organizations that will better balance the dual requirements of legitimacy and effectiveness will be the primary task for the coming generation. As a result of more than 200 years of political evolution, we have a relatively good understanding of how to create institutions that are rule-bound, accountable and reasonably effective in the vertical silos we call states. What we do not have are adequate mechanisms of horizontal accountability among states.
The conservative critique of the United Nations is all too cogent: while useful for certain peacekeeping and nation-building operations, the United Nations lacks both democratic legitimacy and effectiveness in dealing with serious security issues. The solution is not to strengthen a single global body, but rather to promote what has been emerging in any event, a "multi-multilateral world" of overlapping and occasionally competing international institutions that are organized on regional or functional lines. Kosovo in 1999 was a model: when the Russian veto prevented the Security Council from acting, the United States and its NATO allies simply shifted the venue to NATO, where the Russians could not block action.
The final area that needs rethinking, and the one that will be the most contested in the coming months and years, is the place of democracy promotion in American foreign policy. The worst legacy that could come from the Iraq war would be an anti-neoconservative backlash that coupled a sharp turn toward isolation with a cynical realist policy aligning the United States with friendly authoritarians. Good governance, which involves not just democracy but also the rule of law and economic development, is critical to a host of outcomes we desire, from alleviating poverty to dealing with pandemics to controlling violent conflicts. A Wilsonian policy that pays attention to how rulers treat their citizens is therefore right, but it needs to be informed by a certain realism that was missing from the thinking of the Bush administration in its first term and of its neoconservative allies.
We need in the first instance to understand that promoting democracy and modernization in the Middle East is not a solution to the problem of jihadist terrorism; in all likelihood it will make the short-term problem worse, as we have seen in the case of the Palestinian election bringing Hamas to power. Radical Islamism is a byproduct of modernization itself, arising from the loss of identity that accompanies the transition to a modern, pluralist society. It is no accident that so many recent terrorists, from Sept. 11's Mohamed Atta to the murderer of the Dutch filmmaker Theo van Gogh to the London subway bombers, were radicalized in democratic Europe and intimately familiar with all of democracy's blessings. More democracy will mean more alienation, radicalization and — yes, unfortunately — terrorism.
But greater political participation by Islamist groups is very likely to occur whatever we do, and it will be the only way that the poison of radical Islamism can ultimately work its way through the body politic of Muslim communities around the world. The age is long since gone when friendly authoritarians could rule over passive populations and produce stability indefinitely. New social actors are mobilizing everywhere, from Bolivia and Venezuela to South Africa and the Persian Gulf. A durable Israeli-Palestinian peace could not be built upon a corrupt, illegitimate Fatah that constantly had to worry about Hamas challenging its authority. Peace might emerge, sometime down the road, from a Palestine run by a formerly radical terrorist group that had been forced to deal with the realities of governing.
If we are serious about the good governance agenda, we have to shift our focus to the reform, reorganization and proper financing of those institutions of the United States government that actually promote democracy, development and the rule of law around the world, organizations such as the State Department, U.S.A.I.D. and the National Endowment for Democracy. The United States has played an often decisive role in helping along many recent democratic transitions, including in the Philippines in 1986; South Korea and Taiwan in 1987; Chile in 1988; Poland and Hungary in 1989; Serbia in 2000; Georgia in 2003; and Ukraine in 2004-5. But the overarching lesson that emerges from these cases is that the United States does not get to decide when and where democracy comes about. By definition, outsiders can't "impose" democracy on a country that doesn't want it; demand for democracy and reform must be domestic. Democracy promotion is therefore a long-term and opportunistic process that has to await the gradual ripening of political and economic conditions to be effective.
The Bush administration has been walking — indeed, sprinting — away from the legacy of its first term, as evidenced by the cautious multilateral approach it has taken toward the nuclear programs of Iran and North Korea. Condoleezza Rice gave a serious speech in January about "transformational diplomacy" and has begun an effort to reorganize the nonmilitary side of the foreign-policy establishment, and the National Security Strategy document is being rewritten. All of these are welcome changes, but the legacy of the Bush first-term foreign policy and its neoconservative supporters has been so polarizing that it is going to be hard to have a reasoned debate about how to appropriately balance American ideals and interests in the coming years. The reaction against a flawed policy can be as damaging as the policy itself, and such a reaction is an indulgence we cannot afford, given the critical moment we have arrived at in global politics.
Neoconservatism, whatever its complex roots, has become indelibly associated with concepts like coercive regime change, unilateralism and American hegemony. What is needed now are new ideas, neither neoconservative nor realist, for how America is to relate to the rest of the world — ideas that retain the neoconservative belief in the universality of human rights, but without its illusions about the efficacy of American power and hegemony to bring these ends about.
Francis Fukuyama teaches at the School of Advanced International Studies at Johns Hopkins University.
This essay is adapted from his book "America at the Crossroads," which will be published this month by Yale University Press.
Thursday, February 16
Invisible Men
The not-people we're not holding at Guantanamo Bay.
By Dahlia Lithwick
It's an immutable rule of journalism that when you unearth three instances of a phenomenon, you've got a story. So, you might think three major reports on Guantanamo Bay, all released within a span of two weeks, might constitute a big story. But somehow they do not.
Guantanamo Bay currently holds over 400 prisoners. The Bush administration has repeatedly described these men as "the worst of the worst." Ten have been formally charged with crimes and will someday face military tribunals. The rest wait to learn what they have done wrong. Two major studies conclude that most of them have done very little wrong. A third says they are being tortured while they wait.
No one disputes that the real criminals at Guantanamo should be brought to justice. But now we have proof that most of the prisoners are guilty only of bad luck and that we are casually destroying their lives. The first report was written by Corine Hegland and published two weeks ago in the National Journal. Hegland scrutinized the court documents of 132 prisoners—approximately one-quarter of the detainees—who have filed habeas corpus petitions, as well as the redacted transcripts of the hearings that 314 prisoners have received in appearing before military Combatant Status Review Tribunals—the preliminary screening process that is supposed to ascertain whether they are "enemy combatants," as the Bush administration claims. Hegland's exhaustive review concludes that most of the detainees are not Afghans and that most were not picked up on the battlefield in Afghanistan. The vast majority were instead captured in Pakistan. Seventy-five of the 132 men are not accused of taking part in hostilities against the United States. The data suggests that maybe 80 percent of these detainees were never al-Qaida members, and many were never even Taliban foot soldiers.
Most detainees are being held for the crime of having "associated" with the Taliban or al-Qaida—often in the most attenuated way, including having known or lived with people assumed to be Taliban, or worked for charities with some ties to al-Qaida. Some had "combat" experience that seems to have consisted solely of being hit by U.S. bombs. Most were not picked up by U.S. forces but handed over to our military by Afghan warlords in exchange for enormous bounties and political payback.
But weren't they all proved guilty of something at their status review hearings? Calling these proceedings "hearings" does violence to that word. Detainees are assumed guilty until proven innocent, provided no lawyers, and never told what the evidence against them consists of. That evidence, according to another report by Hegland, often consists of little beyond admissions or accusations by other detainees that follow hundreds of hours of interrogations. (A single prisoner at Guantanamo, following repeated interrogation, accused over 60 of his fellow inmates—or more than 10 percent of the prison's population. Some of his accounts are factual impossibilities.) Another detainee "confessed" following an interminable interrogation, shouting: "Fine, you got me; I'm a terrorist." When the government tried to list this as a confession, his own interrogators were forced to break the outrageous game of telephone and explain it as sarcasm. A Yemeni accused of being a Bin Laden bodyguard eventually "admitted" to having seen Bin Laden five times: "Three times on Al Jazeera and twice on Yemeni news." His file: "Detainee admitted to knowing Osama Bin Laden."
Mark Denbeaux, who teaches law at Seton Hall University in New Jersey, and attorney Joshua Denbeaux published a second report several days after Hegland. They represent two detainees. Their data on the evidence amassed against the entire detainee population jibes with Hegland's. They evaluated written determinations produced by the government for the Combatant Status Review Tribunals; in other words, the government's best case against the prisoners, in the government's own words.
The Seton Hall study found that 55 percent of the detainees are not suspected of having committed any hostile acts against the United States and that 40 percent of the detainees are not affiliated with al-Qaida. Eight percent are listed as having fought for a terrorist group, and 60 percent are merely accused of being "associated with" terrorists—the lowest categorization available. They confirm that 86 percent were captured either by the Northern Alliance or by Pakistan "at a time in which the United States offered large bounties for capture of suspected enemies." They quote a flier, distributed in Afghanistan at the time of the sweeps, that reads: "Get wealth and power beyond your dreams ... You can receive millions of dollars helping the anti-Taliban forces catch Al Qaida and Taliban murderers. This is enough money to take care of your family, your tribe, your village for the rest of your life. Pay for livestock and doctors and school books."
While some of the evidence against the detainees appears damning—11 percent are said to have "met with Bin Laden" (I suppose that includes the guy who saw him on TV)—most are accused of "associating with terrorists" based on having met with unnamed individuals, used a guesthouse, owned a Casio watch, or worn olive drab clothing. Thirty-nine percent possessed a Kalashnikov rifle—almost as fashionable in that part of the world as a Casio. Many were affiliated with groups not on the Department of Homeland Security's terrorist watch list.
The third report was released today by the U.N. Commission on Human Rights. Five rapporteurs spent 18 months investigating conditions at Guantanamo, based on information provided by released detainees or family members, lawyers, and Defense Department documents. The investigators were not scrutinizing charges. They were assessing humanitarian conditions. They declined to visit the camp itself when they were told they'd be forbidden to meet with the prisoners. Their 41-page document concludes that the government is violating numerous human rights—including the ban on torture and arbitrary detention and the right to a fair trial. The investigators were particularly bothered by reports of violent force-feeding of hunger-strikers and interrogation techniques including prolonged solitary confinement; exposure to extreme temperatures, noise, and light; and forced shaving. It concludes: "The United States government should close the Guantanamo Bay detention facilities without further delay" and recommends the detainees be released or tried.
And why doesn't the government want to put these prisoners on trial? The administration has claimed that it needs these men for their intelligence value: to interrogate them about further 9/11-like plots. But as Hegland reports, by the fall of 2002 it was already common knowledge in the government that "fewer than 10 percent of Guantanamo's prisoners were high-value terrorist operatives," according to Michael Scheuer, who headed the C.I.A.'s Bin Laden unit until 1999 and resigned from the agency in 2004. Three years later, the government's own documents reveal that hundreds of hours of ruthless questioning have produced only the quasi-comic, quasi-tragic spectacle of weary prisoners beginning to finger one another.
The government's final argument is that we are keeping them from rejoining the war against us, a war that has no end. But that is the most disingenuous claim of all: If any hardened anti-American zealots leave Guantanamo, they will be of our own creation. Nothing will radicalize a man faster than years of imprisonment based on unfounded charges; that's why Abu Ghraib has become the world's foremost crime school. A random sweep of any 500 men in the Middle East right now might turn up dozens sporting olive drab and Casio watches, and dozens more who fiercely hate the United States. Do we propose to detain them all indefinitely and without charges?
The only real justification for the continued disgrace that is Guantanamo is that the government refuses to admit it's made a mistake. Releasing hundreds of prisoners after holding them for four years without charges would be big news. Better, a Guantanamo at which nothing has happened in four years. Better to drain the camp slowly, releasing handfuls of prisoners at a time. Last week, and with little fanfare, seven more detainees were let go. That brings the total number of releasees to 180, with 76 transferred to the custody of other countries. Are these men who are quietly released the "best of the worst"? No. According to the National Journal one detainee, an Australian fundamentalist Muslim, admitted to training several of the 9/11 hijackers and intended to hijack a plane himself. He was released to his home government last year. A Briton said to have targeted 33 Jewish organizations in New York City is similarly gone. Neither faces charges at home.
Guantanamo represents a spectacular failure of every branch of government. Congress is willing to pass a bill stripping courts of habeas-corpus jurisdiction for detainees but unwilling to probe what happens to them. The Supreme Court's decision in Rasul v. Bush conferred seemingly theoretical rights enforceable in theoretical courtrooms. The right to challenge a government detention is older than this country and yet Guantanamo grinds on.
It grinds on because the Bush administration gets exactly what it pays for in that lease: Guantanamo is a not-place. It's neither America nor Cuba. It is peopled by people without names who face no charges. Non-people facing non-trials to defend non-charges are not a story. They are a headache. No wonder the prisoners went on hunger strikes. Not-eating, ironically enough, is the only way they could try to become real to us.
Dahlia Lithwick is a Slate senior editor.
By Dahlia Lithwick
It's an immutable rule of journalism that when you unearth three instances of a phenomenon, you've got a story. So, you might think three major reports on Guantanamo Bay, all released within a span of two weeks, might constitute a big story. But somehow they do not.
Guantanamo Bay currently holds over 400 prisoners. The Bush administration has repeatedly described these men as "the worst of the worst." Ten have been formally charged with crimes and will someday face military tribunals. The rest wait to learn what they have done wrong. Two major studies conclude that most of them have done very little wrong. A third says they are being tortured while they wait.
No one disputes that the real criminals at Guantanamo should be brought to justice. But now we have proof that most of the prisoners are guilty only of bad luck and that we are casually destroying their lives. The first report was written by Corine Hegland and published two weeks ago in the National Journal. Hegland scrutinized the court documents of 132 prisoners—approximately one-quarter of the detainees—who have filed habeas corpus petitions, as well as the redacted transcripts of the hearings that 314 prisoners have received in appearing before military Combatant Status Review Tribunals—the preliminary screening process that is supposed to ascertain whether they are "enemy combatants," as the Bush administration claims. Hegland's exhaustive review concludes that most of the detainees are not Afghans and that most were not picked up on the battlefield in Afghanistan. The vast majority were instead captured in Pakistan. Seventy-five of the 132 men are not accused of taking part in hostilities against the United States. The data suggests that maybe 80 percent of these detainees were never al-Qaida members, and many were never even Taliban foot soldiers.
Most detainees are being held for the crime of having "associated" with the Taliban or al-Qaida—often in the most attenuated way, including having known or lived with people assumed to be Taliban, or worked for charities with some ties to al-Qaida. Some had "combat" experience that seems to have consisted solely of being hit by U.S. bombs. Most were not picked up by U.S. forces but handed over to our military by Afghan warlords in exchange for enormous bounties and political payback.
But weren't they all proved guilty of something at their status review hearings? Calling these proceedings "hearings" does violence to that word. Detainees are assumed guilty until proven innocent, provided no lawyers, and never told what the evidence against them consists of. That evidence, according to another report by Hegland, often consists of little beyond admissions or accusations by other detainees that follow hundreds of hours of interrogations. (A single prisoner at Guantanamo, following repeated interrogation, accused over 60 of his fellow inmates—or more than 10 percent of the prison's population. Some of his accounts are factual impossibilities.) Another detainee "confessed" following an interminable interrogation, shouting: "Fine, you got me; I'm a terrorist." When the government tried to list this as a confession, his own interrogators were forced to break the outrageous game of telephone and explain it as sarcasm. A Yemeni accused of being a Bin Laden bodyguard eventually "admitted" to having seen Bin Laden five times: "Three times on Al Jazeera and twice on Yemeni news." His file: "Detainee admitted to knowing Osama Bin Laden."
Mark Denbeaux, who teaches law at Seton Hall University in New Jersey, and attorney Joshua Denbeaux published a second report several days after Hegland. They represent two detainees. Their data on the evidence amassed against the entire detainee population jibes with Hegland's. They evaluated written determinations produced by the government for the Combatant Status Review Tribunals; in other words, the government's best case against the prisoners, in the government's own words.
The Seton Hall study found that 55 percent of the detainees are not suspected of having committed any hostile acts against the United States and that 40 percent of the detainees are not affiliated with al-Qaida. Eight percent are listed as having fought for a terrorist group, and 60 percent are merely accused of being "associated with" terrorists—the lowest categorization available. They confirm that 86 percent were captured either by the Northern Alliance or by Pakistan "at a time in which the United States offered large bounties for capture of suspected enemies." They quote a flier, distributed in Afghanistan at the time of the sweeps that reads: "Get wealth and power beyond your dreams ... You can receive millions of dollars helping the anti-Taliban forces catch Al Qaida and Taliban murderers. This is enough money to take care of your family, your tribe, your village for the rest of your life. Pay for livestock and doctors and school books."
While some of the evidence against the detainees appears damning—11 percent are said to have "met with Bin Laden" (I suppose that includes the guy who saw him on TV)—most are accused of "associating with terrorists" based on having met with unnamed individuals, used a guesthouse, owned a Casio watch, or wearing olive drab clothing. Thirty-nine percent possessed a Kalashnikov rifle—almost as fashionable in that part of the world as a Casio. Many were affiliated with groups not on the Department of Homeland Security's Terrorist watch list.
The third report was released today by the U.N. Commission on Human Rights. Five rapporteurs spent 18 months investigating conditions at Guantanamo, based on information provided by released detainees or family members, lawyers, and Defense Department documents. The investigators were not scrutinizing charges. They were assessing humanitarian conditions. They declined to visit the camp itself when they were told they'd be forbidden to meet with the prisoners. Their 41-page document concludes that the government is violating numerous human rights—including the ban on torture and arbitrary detention and the right to a fair trial. The investigators were particularly bothered by reports of violent force-feeding of hunger-strikers and interrogation techniques including prolonged solitary confinement; exposure to extreme temperatures, noise, and light; and forced shaving. It concludes: "The United States government should close the Guantanamo Bay detention facilities without further delay" and recommends the detainees be released or tried.
And why doesn't the government want to put these prisoners on trial? The administration has claimed that it needs these men for their intelligence value; to interrogate them about further 9/11-like plots. But as Hegland reports, by the fall of 2002 it was already common knowledge in the government that "fewer than 10 percent of Guantanamo's prisoners were high-value terrorist operatives," according to Michael Scheuer, who headed the agency's Bin Laden unit from 1999 until he resigned in 2004. Three years later, the government's own documents reveal that hundreds of hours of ruthless questioning have produced only the quasi-comic, quasi-tragic spectacle of weary prisoners beginning to finger one another.
The government's final argument is that we are keeping them from rejoining the war against us, a war that has no end. But that is the most disingenuous claim of all: If any hardened anti-American zealots leave Guantanamo, they will be of our own creation. Nothing will radicalize a man faster than years of imprisonment based on unfounded charges; that's why Abu Ghraib has become the world's foremost crime school. A random sweep of any 500 men in the Middle East right now might turn up dozens sporting olive drab and Casio watches, and dozens more who fiercely hate the United States. Do we propose to detain them all indefinitely and without charges?
The only real justification for the continued disgrace that is Guantanamo is that the government refuses to admit it's made a mistake. Releasing hundreds of prisoners after holding them for four years without charges would be big news. Better, a Guantanamo at which nothing has happened in four years. Better to drain the camp slowly, releasing handfuls of prisoners at a time. Last week, and with little fanfare, seven more detainees were let go. That brings the total number of releasees to 180, with 76 transferred to the custody of other countries. Are these men who are quietly released the "best of the worst"? No. According to the National Journal one detainee, an Australian fundamentalist Muslim, admitted to training several of the 9/11 hijackers and intended to hijack a plane himself. He was released to his home government last year. A Briton said to have targeted 33 Jewish organizations in New York City is similarly gone. Neither faces charges at home.
Guantanamo represents a spectacular failure of every branch of government. Congress is willing to pass a bill stripping courts of habeas-corpus jurisdiction for detainees but unwilling to probe what happens to them. The Supreme Court's decision in Rasul v. Bush conferred seemingly theoretical rights enforceable in theoretical courtrooms. The right to challenge a government detention is older than this country, and yet Guantanamo grinds on.
It grinds on because the Bush administration gets exactly what it pays for in that lease: Guantanamo is a not-place. It's neither America nor Cuba. It is peopled by people without names who face no charges. Non-people facing non-trials to defend non-charges are not a story. They are a headache. No wonder the prisoners went on hunger strikes. Not-eating, ironically enough, is the only way they could try to become real to us.
Dahlia Lithwick is a Slate senior editor.
The Patent Office as Thought Police
By LORI B. ANDREWS
The boundaries of academic freedom may be vastly circumscribed by the U.S. Supreme Court this term in a case that is not even on most universities' radar. Laboratory Corporation of America Holdings v. Metabolite Laboratories Inc. is not a traditional case of academic freedom involving professors as parties and raising First Amendment concerns. In fact, nobody from a university is a party in this commercial dispute, a patent case between two for-profit laboratories. But at the heart of the case is the essence of campus life: the freedom to think and publish.
The saga began when researchers from Columbia University and the University of Colorado Health Sciences Center developed a test to measure the level of homocysteine, an amino acid, in a patient's body. In research on thousands of people, the investigators learned that a high level of homocysteine is correlated with a vitamin deficiency: low levels of cobalamin or folate.
Other tests for homocysteine existed and were used for a variety of medical disorders. But considering theirs to be an improvement, the researchers applied for a patent. In their application, they also claimed that, because they were the first to recognize that a high level of homocysteine is connected to a vitamin deficiency, they should be allowed to patent that basic physiological fact. Thus they would be owed a royalty anytime anyone used any test for homocysteine and concluded that an elevated level signified a vitamin deficiency. They received U.S. Patent No. 4,940,658 — known as the '658 patent — and later licensed it to Metabolite Laboratories.
Laboratory Corporation of America, called LabCorp, in turn licensed from Metabolite the right to perform the test, and it paid royalties every time it used the patented method. But then Abbott Laboratories developed a homocysteine test that LabCorp considered more efficient; the new test was sufficiently novel that it did not infringe Metabolite's patent. LabCorp began using the new test.
That did not infringe the patent, either. But after LabCorp published an article stating that high homocysteine levels might indicate a vitamin deficiency that could be treated by vitamins, Metabolite sued LabCorp for patent infringement and breach of contract, and was awarded more than $5-million in damages.
LabCorp appealed to the U.S. Court of Appeals for the Federal Circuit, which hears all patent appeals. Astonishingly, it held that LabCorp had induced doctors to infringe the patent by publishing the biological fact that high homocysteine levels indicate vitamin deficiency. The court also ruled that the doctors had directly infringed the patent by merely thinking about the physiological relationship. (Metabolite had not sued the doctors, probably because such lawsuits would have cost more than they would have netted the company and would have produced negative publicity.)
By considering publishing and thinking about a law of nature to be actionable under patent law, the Federal Circuit court has severely threatened academic freedom. Professors everywhere should be concerned about the case, and how the Supreme Court will rule on LabCorp's appeal.
The decision has set off a rush to the patent office to assert ownership over other scientific facts and methods of scientific and medical inquiry. In an amicus brief to the Supreme Court, a group called Patients Not Patents quoted a recent application seeking to patent "a method of evaluating a risk of occurrence of a medical condition in a patient, the method comprising: receiving a patient dataset for the patient; and evaluating the dataset with a model predictive of the medical condition." The holder of such a patent could sue any doctor who was engaging in the practice of medicine, or any researcher who was trying to develop new ways to diagnose or treat an illness.
It has been a quarter-century since the Supreme Court's last landmark patent case, Diamond v. Chakrabarty, which allowed the patenting of genetically engineered bacteria. But that case contains a clue as to how the justices might decide this one. The court said: "The laws of nature, physical phenomena, and abstract ideas have been held not patentable. ... Thus, a new mineral discovered in the earth or a new plant found in the wild is not patentable subject matter. Likewise, Einstein could not patent his celebrated law that E=mc²; nor could Newton have patented the law of gravity. Such discoveries are 'manifestations of ... nature, free to all men and reserved exclusively to none.'"
Further, in a 1948 case, Funk Bros. Seed Co. v. Kalo Inoculant Co., the court considered a patent on the biological properties of certain bacteria used in the inoculation of seeds. While noting that the discovery was "ingenious," the court held that "he who discovers a hitherto unknown phenomenon of nature has no claim to a monopoly of it which the law recognizes."
There are good policy reasons for not granting patents on laws of nature, which the Supreme Court articulated in 1853 in O'Reilly v. Morse. Samuel F.B. Morse had received a patent granting him not only the rights to the telegraph, which he had invented, but also broad rights to a law of nature: the use of electromagnetic waves to write at a distance. The court invalidated those broad rights, saying: "If this claim can be maintained, it matters not by what process or machinery the result is accomplished. ... Some future inventor, in the onward march of science, may discover a mode of writing or printing at a distance by means of the electric or galvanic current, without using any part of the process or combination set forth in the plaintiff's specification. His invention may be less complicated — less liable to get out of order — less expensive in construction, and in its operation. But yet if it is covered by this patent the inventor could not use it, nor the public have the benefit of it without the permission of this patentee."
Upholding the '658 patent would discourage the sharing of scientific information through publication, impede innovation, and give the patentee the type of broad rights to future inventions that troubled the Supreme Court in the Morse case. But it would also threaten medical research in a more fundamental way.
Many patients served as subjects and gave generously of themselves so that researchers could recognize an inherent biological phenomenon — the association between high homocysteine levels and vitamin deficiency. More than 300 of those patients were subjected to full clinical evaluations, including risky neurological evaluations, blood and bone-marrow smears, and repeated serum tests for antibodies over a period of two years. Almost 8,000 patients were studied and tested to discover the phenomenon. Yet neither the patent nor the researchers' publications contain any evidence that the patients were informed that medical information from their bodies would be patented.
If the Supreme Court upholds the '658 patent, it would not be unreasonable for patients in the future to refuse to participate in medical research at academic institutions. Why should people donate their time and subject themselves to potential physical risks if the result is a patent that would ultimately increase health-care costs and deter innovation?
The case was sufficiently troubling to me that I helped a patients' advocacy group, the People's Medical Society, prepare an amicus brief for the Supreme Court. As I worked on it, an old adage popped into my head: What goes around comes around.
University researchers from Columbia University and the University of Colorado were the inventors in the patent. In fact, it was originally assigned to an entity called University Patents, whose goal was to commercialize the discoveries of university professors. But by greedily going beyond their invention — a medical test — to claim rights to a basic fact of human physiology, those professors and their institutions set in motion a patent nightmare that could limit their own academic freedom as well as that of everybody else.
Lori B. Andrews is a professor of law at the Chicago-Kent College of Law at the Illinois Institute of Technology and director of the Institute for Science, Law, and Technology there. Her first novel, Sequence, will be published in June by St. Martin's Press.