A Secret the Terrorists Already Knew
By RICHARD A. CLARKE and ROGER W. CRESSEY
COUNTERTERRORISM has become a source of continuing domestic and international political controversy. Much of it, like the role of the Iraq war in inspiring new terrorists, deserves analysis and debate. Increasingly, however, many of the political issues surrounding counterterrorism are formulaic, knee-jerk, disingenuous and purely partisan. The current debate about United States monitoring of transfers over the Swift international financial system strikes us as a case of over-reaction by both the Bush administration and its critics.
Going after terrorists' money is a necessary element of any counterterrorism program, as President Bill Clinton pointed out in presidential directives in 1995 and 1998. Individual terrorist attacks do not typically cost very much, but running terrorist cells, networks and organizations can be extremely expensive.
Al Qaeda, Hamas, Hezbollah and other terrorist groups have had significant fund-raising operations involving solicitation of wealthy Muslims, distribution of narcotics and even sales of black market cigarettes in New York. As part of a "follow the money" strategy, monitoring international bank transfers is worthwhile (even if, given the immense number of transactions and the relatively few made by terrorists, it is not highly productive) because it makes operations more difficult for our enemies. It forces them to use more cumbersome means of moving money.
Privacy rights advocates, with whom we generally agree, have lumped this bank-monitoring program with the alleged National Security Agency wiretapping of calls in which at least one party is within the United States as examples of our government violating civil liberties in the name of counterterrorism. The two programs are actually very different.
Any domestic electronic surveillance without a court order, no matter how useful, is clearly illegal. Monitoring international bank transfers, especially with the knowledge of the bank consortium that owns the network, is legal and unobjectionable.
The International Emergency Economic Powers Act, passed in 1977, provides the president with enormous authority over financial transactions by America's enemies. International initiatives against money laundering have been under way for a decade, and have been aimed not only at terrorists but also at drug cartels, corrupt foreign officials and a host of criminal organizations.
These initiatives, combined with treaties and international agreements, should leave no one with any presumption of privacy when moving money electronically between countries. Indeed, since 2001, banks have been obliged to report even transactions entirely within the United States if there is reason to believe illegal activity is involved. Thus we find the privacy and illegality arguments wildly overblown.
So, too, however, are the Bush administration's protests that the press revelations about the financial monitoring program may tip off the terrorists. Administration officials made the same kinds of complaints about news media accounts of electronic surveillance. They want the public to believe that it had not already occurred to every terrorist on the planet that his telephone was probably monitored and his international bank transfers subject to scrutiny. How gullible does the administration take the American citizenry to be?
Terrorists have for many years employed nontraditional communications and money transfers — including the ancient Middle Eastern hawala system, involving couriers and a loosely linked network of money brokers — precisely because they assume that international calls, e-mail and banking are monitored not only by the United States but by Britain, France, Israel, Russia and even many third-world countries.
While this was not news to terrorists, it may, it appears, have been news to some Americans, including some in Congress. But should the press really be called unpatriotic by the administration, and even threatened with prosecution by politicians, for disclosing things the terrorists already assumed?
In the end, all the administration denunciations do is give the press accounts an even higher profile. If administration officials were truly concerned that terrorists might learn something from these reports, they would be wise not to give them further attention by repeatedly fulminating about them.
There is, of course, another possible explanation for all the outraged bloviating. It is an election year. Karl Rove has already said that if it were up to the Democrats, Abu Musab al-Zarqawi would still be alive. The attacks on the press are part of a political effort by administration officials to use terrorism to divide America, and to scare their supporters to the polls again this year.
The administration and its Congressional backers want to give the impression that they are fighting a courageous battle against those who would wittingly or unknowingly help the terrorists. And with four months left before Election Day, we can expect to hear many more outrageous claims about terrorism — from partisans on both sides. By now, sadly, Americans have come to expect it.
Richard A. Clarke and Roger W. Cressey, counterterrorism officials on the National Security Council under Presidents Bill Clinton and George W. Bush, are security consultants.
Thursday, June 29
Court's Ruling Is Likely to Force Negotiations Over Presidential Power
June 30, 2006
News Analysis
By DAVID E. SANGER and SCOTT SHANE
WASHINGTON, June 29 — The Supreme Court's Guantánamo ruling on Thursday was the most significant setback yet for the Bush administration's contention that the Sept. 11 attacks and their aftermath have justified one of the broadest expansions of presidential power in American history.
President Bush and Vice President Dick Cheney spent much of their first term bypassing Congress in the service of what they labeled a "different kind of war." Now they will almost certainly plunge into negotiations they previously spurned, over the extent of the president's powers, this time in the midst of a midterm election in which Mr. Bush's wartime strategies and their consequences have emerged as a potent issue.
The ruling bolsters those in Congress who for months have been trying to force the White House into a retreat from its claims that Mr. Bush not only has the unilateral authority as commander in chief to determine how suspected terrorists are tried, but also to set the rules for domestic wiretapping, for interrogating prisoners and for pursuing a global fight against terror that many suspect could stretch for as long as the cold war did.
What the court's 5-to-3 decision declared, in essence, was that Mr. Bush and Mr. Cheney had overreached and must now either use the established rules of courts-martial or go back to Congress — this time with vastly diminished leverage — to win approval for the military commissions that Mr. Bush argues are the best way to keep the nation safe.
For Mr. Bush, this is not the first such setback. The court ruled two years ago that the giant prison at Guantánamo Bay, Cuba, was not beyond the reach of American courts and that prisoners there had some minimal rights.
Then, last year, came the overwhelming 90-to-9 vote in the Senate, over Mr. Cheney's strong objections, to ban "cruel, inhuman and degrading" treatment of prisoners. That forced Mr. Bush, grudgingly, to reach an accord with Senator John McCain, Republican of Arizona, on principles for interrogation, which are still being turned into rules.
As seen by Mr. Bush's critics, the court has finally reined in an executive who used the Sept. 11 attacks as a justification — or an excuse — to tilt the balance of power decidedly toward the White House.
"This is a great triumph for the rule of law and the separation of powers," said Bruce Ackerman, a professor of law and political science at Yale. "The administration will have to go back to Congress and talk in a much more discriminating fashion about what we need to do."
Some allies of Mr. Bush reacted bitterly on Thursday, asserting that it was the court, rather than Mr. Bush, that had over-reacted.
"Nothing about the administration's solution was radical or even particularly aggressive," said Bradford A. Berenson, who served from 2001 to 2003 as associate White House counsel. "What is truly radical is the Supreme Court's willingness to bend to world opinion and undermine some of the most important foundations of American national security law in the middle of a war."
At least rhetorically, the administration is giving no ground about the reach of the president's powers. Just 10 days ago, speaking here in Washington, Mr. Cheney cited the responses to Watergate and the Vietnam War as examples of where he thought Congress had "begun to encroach upon the power and responsibilities of the president," and said he had come to the White House with the view that "it was important to go back and try to restore that balance."
Since taking office, Mr. Bush and Mr. Cheney have largely tried to do so by fiat, sometimes with public declarations, sometimes with highly classified directives governing how suspects could be plucked from the battlefield or, in the case decided on Thursday, how they would be tried. The president's tone on Thursday, during a news conference with Prime Minister Junichiro Koizumi of Japan, suggested that he recognized he might now have to give ground.
Mr. Bush said he would be taking "the findings" of the Supreme Court "very seriously."
"One thing I'm not going to do, though, is I'm not going to jeopardize the safety of the American people," he said. But then he backtracked a bit, saying that he would "work with Congress" to give legal foundation to the system he had already put in place.
To some degree, the court may have helped Mr. Bush out of a political predicament. He has repeatedly said he would like to close the detention center at Guantánamo, a recognition that the indefinite imprisonment of suspects without trial and the accusations that they have been mistreated were seriously undercutting American credibility abroad. But he set no schedule and said he was waiting for the court to rule.
"The court really rescued the administration by taking it out of this quagmire it's been in," said Michael Greenberger, who teaches the law of counterterrorism at the University of Maryland law school.
Now Congress, with the court's encouragement, may help the president find a way forward. For Senator Lindsey Graham, Republican of South Carolina, who said a legislative proposal on military commissions he sent to the White House 18 months ago "went nowhere," the ruling was a welcome restoration of the balance of power.
"The Supreme Court has set the rules of the road," Mr. Graham, a former military lawyer, said, "and the Congress and the president can drive to the destination together."
Supporters of the president emphasized that the question of how to balance suspects' rights against the need for intelligence on imminent attacks was always a daunting challenge, and that the ruling did not change that.
Indeed, said Jack Goldsmith, who headed the Justice Department's Office of Legal Counsel in 2003 and 2004, the fact that no second attack has occurred on American soil is an achievement of the administration that is now complicating its political situation.
"The longer the president and the administration successfully prevent another attack," Mr. Goldsmith said, "the more people think the threat has abated and the more they demand that the administration adhere to traditional civil liberties protections."
In today's less panicky national mood, tough measures that few dared question as American forces first moved into Afghanistan, then Iraq, are now the subject of nightly debate on cable television and of a small flotilla of court challenges.
But history suggests that this pendulum swing was inevitable. It took years, but history came to condemn the internment of Japanese-Americans during World War II, and to question Lincoln's suspension of habeas corpus during the Civil War.
Sooner or later, that same reversal was bound to happen to Mr. Bush and Mr. Cheney. The question is how far it will swing back while they are still in office and while what Mr. Bush calls "the long war" continues around the globe.
The Subject is Taboo: On the Importance of Genetic Research
I have a clear memory of the moment I learned the word “taboo.” I was 7 or 8, and I had just spent the day walking around a golf course with a great friend of my mother’s (and later, of mine), a man called Tim, and his opponent, a woman called Nora. They had not played each other before. Nora, it transpired, was an excellent golfer. Tim was not. She trounced him. Worse, she didn’t do it from what I learned were called the “ladies’ tees.” She didn’t do it from the “gentlemen’s tees.” She did it from the hardest of all, the “tiger tees.”
I was chatting happily about this, not knowing that women were supposed to lose to men, not knowing that Nora’s tigerish defeat of him was, in Tim’s mind, an exasperating humiliation. I soon found out. As I relived Nora’s victory yet again, Tim leaned over to me and said, “Olivia. The subject is taboo. Do you know what that means?” And he explained.
Looking back, it seems somehow fitting that I learned this word in the context of male versus female performance. For certain subjects in science are taboo — and research into genetic differences in ability or behavior between different groups of people is one of the biggest of all.
The reasons for this are obvious. Some of the most ghastly atrocities of the 20th century were carried out under the banner of the “master race” and nasty pseudoscientific notions about genetic superiority. Sexual and racial discrimination still persist. One legacy of all this is that no one dares look at whether, in humans, differences in anything other than the risk of disease might be influenced by genes. Many geneticists I know are scared — really scared, and with reason — of having their careers ruined if they ask any other questions. Look no further than Lawrence Summers, former president of Harvard University, who was pilloried last year for wondering if mathematical ability in men and women might have some genetic underpinning. A sign has been hung on the door that says “Area Closed to Research.”
Until recently. Research into the genetics behind certain sorts of group differences — skin color, the ability to digest milk, the underpinnings of autism and the like — is now starting to be published. But other subjects remain ferociously contentious. Let me tell you a tale of three papers.
Last September, the journal Science published two papers that claimed natural selection had acted recently and strongly on two human genes involved in brain development. Let’s look at what this claim means.
As you’d expect, a whole slew of genes affect aspects of brain structure in humans. The two featured in the Science papers are among those thought to affect brain growth. People who have two mutant copies of either one of the genes have a condition known as microcephaly — tiny heads and brains. Mental retardation is a consequence.
What do we know about these genes, besides the fact that everyone needs at least one functioning version of each gene for their brain to grow properly? Not much. We may need them for other things as well — both appear to be involved in cell division, for example — but no one knows what exactly. We also know that functioning versions of these genes come in several subtly different forms. Whether these subtle differences matter is unknown. In short, we know (some) of what these genes do when they go wrong, but not much about what they do when they’re normal.
Now, what does it mean to say that natural selection has acted on these genes? As I’ve been discussing over the last 10 days, natural selection happens when organisms carrying particular sets of genes are more successful than those with other sets of genes at surviving and reproducing. Sometimes, natural selection will stop genes from changing — mutant forms of a given gene appear, but fail to spread because they are not helpful. Sometimes, natural selection promotes rapid change: a mutant form of a gene appears and spreads quickly — within a few hundred generations, say. Evidence of a rapid spread — within the last several thousand years — of a new version of each of the two genes is what the Science papers announced.
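To get a feel for how fast such a spread can be, here is a toy calculation, a minimal sketch using the textbook haploid selection model. The 5 percent fitness advantage and the starting frequency are my own illustrative assumptions, not figures from the Science papers:

    # Toy model: how quickly a new gene variant spreads under selection.
    # The 5% advantage (s = 0.05) and the starting frequency are illustrative
    # assumptions, not values reported in the Science papers.

    def generations_to_spread(start=0.001, target=0.99, s=0.05):
        """Generations for the variant's frequency to climb from start to target."""
        p, generations = start, 0
        while p < target:
            p = p * (1 + s) / (1 + p * s)  # standard haploid selection recursion
            generations += 1
        return generations

    print(generations_to_spread())  # roughly 236 generations

Even a modest advantage, in other words, can carry a once-rare variant to near-universality within a few hundred generations, which at roughly 25 years per human generation amounts to several thousand years.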
The papers caused a stir. For the papers also claimed that the new versions of the genes, although present in various populations around the world, were more rare in sub-Saharan Africa than elsewhere. All this means is that, in populations outside Africa, the new forms of the genes may have conferred some sort of advantage — perhaps related to head size, perhaps not — while in Africa, they didn’t. But it didn’t take long for the whisperings to start that the new forms of the genes must be involved in intelligence.
The whispering has no basis: there is no evidence whatsoever that the variants have anything to do with intelligence. The Science papers are, in any case, a first foray into the area; the findings may, or may not, be borne out. But brains, genes and race form an explosive mixture. So much so that — according to a report in The Wall Street Journal last week — the lead scientist on the papers, Bruce Lahn, will now be retiring from working on brain genes.
Meanwhile, another paper has appeared, this time in the March issue of the online journal Public Library of Science Biology. This paper failed to confirm the earlier result. However, the authors found that versions of other genes, also thought to be involved in brain function or structure, have been under recent natural selection in a certain population — and this time, the population is not outside Africa, but in it. (Only one African population was looked at — the Yoruba people of Nigeria. Whether other African populations show the same pattern is unknown.) Again, we have no idea what this means. But strangely, these results have received almost no attention: there has been no whispering this time.
I offer this story as a kind of parable — an illustration of some of the grave difficulties in this field of research. And indeed, the difficulties are myriad. On the scientific side, there’s the problem of trying to figure out what different genes do, how they interact with the environment (this is crucial), and what we can say about our evolutionary past. Then there’s the usual job of interpreting results and of revising the picture as we learn more.
But that’s the least of it. As you can imagine, it is virtually impossible to work in an area as poisonously political as this one. On one side, you have neo-fascist groups twisting the most innocuous data out of shape; on the other, well-intentioned anti-racists who don’t even want the questions asked. Worse still, as the popular success of the “intelligent design” movement shows, it is not always easy to make sure that science is discussed rationally. Result: most geneticists are totally unnerved — and who can blame them?
Perhaps, if open debate is impossible, declaring the area taboo is the best way to proceed. I don’t pretend to have a solution. But here are some thoughts.
First, we know almost nothing about our brains. (Indeed, my bet is that in 100 years, people will look back and laugh about the extent of our ignorance.) If we declare brain genetics out of bounds, it will make it harder to understand how our brains are built. It may make it harder to understand and treat the diseases that affect people’s brains, especially in old age.
Second, the study of human genetics has already illuminated a lot that is interesting and important about our evolutionary past, and how we have come to be. Handled well, this is a tremendously exciting area for research. Do we want to limit it? (Sadly for our vanity, the genes that seem to differ most between humans and chimpanzees are not brain genes, but genes of the immune system and those for making sperm.)
Third, genetic information is pouring in. Questions about the genetics of human differences are not going to go away. On the contrary, they will become more pressing. Scientists have an essential role to play in mediating understanding. Do we really want to scare good scientists from this field? Then the only people left researching it could be those whose agendas genuinely are sinister.
Now that is a frightening thought.
Wednesday, June 28
Geo-Navigation with Cell Phones - Japan again leading US
June 28, 2006
With a Cellphone as My Guide
By JOHN MARKOFF and MARTIN FACKLER
Think of it as a divining rod for the information age.
If you stand on a street corner in Tokyo today you can point a specialized cellphone at a hotel, a restaurant or a historical monument, and with the press of a button the phone will display information from the Internet describing the object you are looking at.
The new service is made possible by the efforts of three Japanese companies and GeoVector, a small American technology firm, and it represents a missing link between cyberspace and the physical world.
The phones combine satellite-based navigation, precise to within 30 feet or less, with an electronic compass to provide a new dimension of orientation. Connect the device to the Internet and it is possible to overlay the point-and-click simplicity of a computer screen on top of the real world.
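GeoVector has not published its method, but the geometry behind "point at it and ask" is simple enough to sketch. The following Python snippet is my own illustration, with a made-up list of landmarks and arbitrary tolerances; it shows how a phone that knows its position and compass heading could decide which nearby points of interest the user is aiming at:

    import math

    EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two points given in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Compass bearing (degrees clockwise from north) from point 1 to point 2."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlam = math.radians(lon2 - lon1)
        y = math.sin(dlam) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
        return (math.degrees(math.atan2(y, x)) + 360) % 360

    def pointed_at(lat, lon, heading_deg, landmarks, tolerance_deg=15, max_range_m=500):
        """Return landmarks roughly in the pointing direction, nearest first."""
        hits = []
        for name, plat, plon in landmarks:
            dist = haversine_m(lat, lon, plat, plon)
            if dist > max_range_m:
                continue
            # smallest angular difference between heading and bearing, 0..180 degrees
            diff = abs((bearing_deg(lat, lon, plat, plon) - heading_deg + 180) % 360 - 180)
            if diff <= tolerance_deg:
                hits.append((round(dist), name))
        return sorted(hits)

    # Hypothetical example: standing in Shinjuku, facing roughly northeast
    landmarks = [("Business hotel", 35.6920, 139.7035), ("Noodle shop", 35.6880, 139.6990)]
    print(pointed_at(35.6895, 139.7005, 45.0, landmarks))  # [(388, 'Business hotel')]

The databases that KDDI and Mapion maintain are far larger and the positioning math more careful, but the core matching step is essentially this bearing-and-distance filter.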
The technology is being seen first in Japan because emergency regulations there require cellphones by next year to have receivers using the satellite-based Global Positioning System to establish their location.
In the United States, carriers have the option of a less precise locating technology that calculates a phone's position based on proximity to cellphone towers, a method precise only to within 100 yards or so.
Only two American carriers are using the G.P.S. technology, and neither has announced plans to add a compass. As a result, analysts say, Japan will have a head start of several years in what many see as a new frontier for mobile devices.
"People are underestimating the power of geographic search," said Kanwar Chadha, chief executive of Sirf Technology, a Silicon Valley maker of satellite-navigation gear.
The idea came to GeoVector's founders, John Ellenby and his son Thomas, one night in 1991 on a sailboat off the coast of Mexico. To compensate for the elder Mr. Ellenby's poor sense of direction, the two men decided that tying together a compass, a Global Positioning System receiver and binoculars would make it possible simply to point at an object or a navigational landmark to identify it.
Now that vision is taking commercial shape in the Japanese phones, which use software and technology developed by the Ellenbys. The system already provides detailed descriptive information or advertisements about more than 700,000 locations in Japan, relayed to the cellphones over the Internet.
One subscriber, Koichi Matsunuma, walked through the crowds in Tokyo's neon-drenched Shinjuku shopping district on Saturday, eyes locked on his silver cellphone as he weaved down narrow alleys. An arrow on the small screen pointed the way to his destination, a business hotel.
"There it is," said Mr. Matsunuma, a 34-year-old administrative worker at a Tokyo music college. "Now, I just wish this screen would let me make reservations as well."
Mr. Matsunuma showed how it works on a Shinjuku street. He selected "lodgings" on the screen. Then he pointed his phone toward a cluster of tall buildings. A list of hotels in that area popped up, with distances. He chose the closest one, about a quarter-mile away. An arrow appeared to show him the way, and in the upper left corner the number of meters ticked down as he got closer. Another click, and he could see a map showing both his and the hotel's locations.
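The arrow and the ticking meter count that Mr. Matsunuma watched come down to two numbers: the distance remaining and the angle between the phone's compass heading and the direction of the destination. Here is a rough, self-contained sketch, again my own illustration rather than anything from the actual service, using a flat-earth approximation that is adequate over city-block distances:

    import math

    R = 6_371_000  # Earth radius, meters

    def arrow_and_countdown(lat, lon, compass_deg, dest_lat, dest_lon):
        """Return (meters remaining, degrees to rotate the on-screen arrow).

        Positive rotation means the arrow points to the user's right.
        Equirectangular approximation; fine for distances under a few kilometers.
        """
        dy = R * math.radians(dest_lat - lat)                                   # north-south offset
        dx = R * math.radians(dest_lon - lon) * math.cos(math.radians((lat + dest_lat) / 2))
        meters = math.hypot(dx, dy)
        target_bearing = math.degrees(math.atan2(dx, dy)) % 360                 # 0 = north, 90 = east
        turn = (target_bearing - compass_deg + 180) % 360 - 180                 # wrapped to (-180, 180]
        return round(meters), round(turn)

    # Hypothetical example: a hotel about 390 meters to the northeast, user facing due north
    print(arrow_and_countdown(35.6895, 139.7005, 0.0, 35.6920, 139.7035))  # (388, 44)

Rerun a few times a second as the G.P.S. fix updates, a calculation like this produces exactly the effect he described: an arrow that swings as he turns and a meter count that falls as he walks.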
Mr. Matsunuma said he used the service frequently in unfamiliar neighborhoods. But it came in most handy one day when he was strolling with his wife in a Tokyo park, and he used it on the spur of the moment to find a Southeast Asian restaurant for lunch.
The point-and-click idea could solve one of the most potentially annoying side effects of local wireless advertising. In the movie "Minority Report," as Tom Cruise's character moved through an urban setting, walls that identified him sent a barrage of personally tailored visual advertising. Industry executives are afraid that similar wireless spam may come to plague cellphones and other portable devices in the future.
"It's like getting junk faxes; nobody wants that," said Marc Rotenberg, director of the Electronic Privacy Information Center, a policy group in Washington. "To the degree you are proactive, the more information that is available to you, the more satisfied you are likely to be."
With the GeoVector technology, control is given over to the user, who gets information only from what he or she points at.
The Ellenbys have developed software that makes it possible to add location-based tourism information, advertising, mobile Yellow Pages and entertainment, as well as functions for locating friends. Microsoft was an underwriter of GeoVector development work several years ago.
"We believe we're the holy grail for local search," said Peter Ellenby, another son of John Ellenby and director of new media at GeoVector.
The GeoVector approach is not the only way that location and direction information can be acquired. Currently G.P.S.-based systems use voice commands to supplement on-screen maps in car dashboards, for example. Similarly, many cellphone map systems provide written or spoken directions to users. But the Ellenbys maintain that a built-in compass is a more direct and less confusing way of navigating in urban environments.
The GeoVector service was introduced commercially this year in Japan by KDDI, a cellular carrier, in partnership with NEC Magnus Communications, a networking company, and Mapion, a company that distributes map-based information over the Internet. It is currently available on three handsets from Sony Ericsson.
In addition to a built-in high-tech compass, the service requires pinpoint accuracy available in urban areas only when satellite-based G.P.S. is augmented with terrestrial radio. The new Japanese systems are routinely able to offer accuracy of better than 30 feet even in urban areas where tall buildings frequently obstruct a direct view of the satellites, Mr. Ellenby said. In trials in Tokyo, he said, he had seen accuracies as precise as six feet.
Patrick Bray, a GeoVector representative in Japan, estimated that 1.2 million to 1.5 million of the handsets had been sold. GeoVector and its partners said they did not know how many people were actually using the service, because it is free and available through a public Web site. But they said they planned in September to offer a fee-based premium service, with a bigger database and more detailed maps. Juichi Yamazaki, an assistant manager at NEC Magnus, said the companies expected 200,000 paying users in the first 12 months.
He said the number of users would also rise as other applications using the technology became available. NEC is testing a game that turns cellphones into imaginary fly rods, with users pointing where and how far to cast. Another idea is to help users rearrange their furniture in accordance with feng shui, a traditional Chinese belief in the benefits of letting life forces flow unimpeded through rooms and buildings.
The market in the United States has yet to be developed. Verizon and Sprint Nextel are the only major American carriers that have put G.P.S. receivers in cellphone handsets.
"The main problem is the carriers," said Kenneth L. Dulaney, a wireless industry analyst at the Gartner Group. Although some cellular companies are now offering location-based software applications on handsets, none have taken advantage of the technology's potential, he said, adding, "They don't seem to have any insight."
Sirf Technology, which makes chips that incorporate the satellite receiver and compass into cellphones, said the chips add less than $10 to the cost of a handset.
Several industry analysts said putting location-based information on cellphones would be a logical step for search engine companies looking for ways to increase advertising revenues. Microsoft has already moved into the cellular handset realm with its Windows Mobile software, and Google is rumored to be working on a Google phone.
According to the market research firm Frost & Sullivan, the American market for location-based applications of all kinds will grow from $90 million last year to about $600 million in 2008.
It is perhaps fitting that the elder Mr. Ellenby, a computer executive at Xerox Palo Alto Research Center in the 1970's, is a pioneer of geolocation technology. In the 1980's he founded Grid Computer, the first maker of light clamshell portable computers, an idea taken from work done by a group of Xerox researchers.
A decade later a Xerox researcher, Mark Weiser, came up with a radically different idea — ubiquitous computing — in which tiny computers disappear into virtually every workaday object to add intelligence to the everyday world. Location-aware cellphones are clearly in that spirit.
Why Diving Makes Soccer Great
In defense of soccer's biggest villains.
By Austin Kelley
Posted Tuesday, June 27, 2006, at 5:20 PM ET
In the final moments of Monday's World Cup match between Italy and Australia, Italian defender Fabio Grosso streaked up the left side toward the goal. He shrugged off one defender who tried to drag him down, then cut into the penalty area. Desperate to stop him, Australia's Lucas Neill slid into his path. Grosso made contact with Neill's prone body and sprawled onto the ground. The referee blew the whistle: penalty.
The Italians made the shot and won the game, 1-0. Fans of the Socceroos are now crying foul, saying Grosso fell on purpose to draw the penalty. After watching numerous replays, I agree that the Italian fell to the ground too easily. The referee still made the right call. This World Cup has inspired a mass of editorial bile about the evils of diving. But diving is not only an integral part of soccer, it's actually good for the sport.
Soccer players sometimes fall to the ground without being dragged down or tripped. Once they're on the ground, they often roll around and exaggerate their injuries. A lot of people hate these theatrics and even characterize them as immoral. Dave Eggers recently said in Slate that flopping is "a combination of acting, lying, begging, and cheating." But diving is far from outright cheating. Rarely do athletes tumble without being touched at all. Usually, they embellish contact to make sure the referee notices a foul, not to deceive him completely.
Obvious, undeserved flops should be punished with yellow cards. Ghana's Asamoah Gyan was justly ejected (on his second yellow) from Tuesday's match with Brazil when he fell without a Brazilian in sight. Such displays serve to discredit more honest, contact-driven divers like Grosso. Diving is like drawing a charge in basketball. When it is done well, it is a subtle (and precarious) art.
Consider the classic matchup between a skilled dribbler and a big, tough defender. The attacker must use his quickness and wit to get by. The bigger man, though, can always resort to a "professional foul"—an intentional foul in which there is no attempt to play the ball. The defender will give away a free kick, but that will hurt only in certain parts of the field. So, what is the attacker to do? If he finds a flailing leg in his way, he can do nothing except barge right into it. And maybe writhe around on the ground for a bit, encouraging the referee to hand out a card, thus discouraging the brutish defender from trying such rough tactics in the future.
Far from being a sign of corruption, diving is, in certain ways, a civilizing influence. Divers are usually quicker, smaller players. As athletes get bigger and stronger, the little guy gets nudged aside. If professional fouls and brute force reign supreme, creative play and joyful improvisation will suffer.
FIFA doesn't see it this way. Prior to this year's World Cup, the organization issued special instructions to referees to crack down on "simulation." This misguided initiative has failed miserably. We've had more flopping than ever despite a record number of yellow cards and ejections.
Referees are partly responsible for the culture of diving. Officials are much more likely to blow their whistles when they see a few somersaults, and players know they might not get a call if they stay on their feet. But in soccer there is only one main referee patrolling a pitch that measures up to 120 by 80 yards. He cannot see everything, and diving is particularly hard to discern. Even with the benefit of slow-motion replays, it's sometimes difficult to tell a flop from a "natural" fall.
The scorn heaped on divers usually doesn't have to do with the logistics of refereeing, though. In reality, it's distaste for the spectacle. American sports are loaded with comic set pieces—a hockey player tossing his gloves for a ceremonial tussle or a baseball manager kicking dirt at the umpire. Like tumbling soccer players, these performers act to provoke sympathy or indignation. The difference is in the style of emotional drama.
In most American sports, the theatrics are aggressive. They are not operatic displays of vulnerability. To appreciate diving, we must sympathize with or scorn the injured player — we must get into the melodrama. Some fans are afraid to take the plunge, preferring to argue that diving makes soccer players seem like babies or, worse still, women. (Former England striker Gary Lineker has called for a special "pink card" to be shown to divers.) Their distaste for the dive is rooted in an idea of masculinity, not in an analysis of the game itself. That idea of masculinity is preventing them from enjoying a pretty good show.
The other most pervasive critique of diving is a nationalist one. Depending on who you talk to, Sunday's flop-heavy, four-red-card debacle between the Netherlands and Portugal was the fault of either Iberian gamesmanship or Dutch fakery. For Anglo-American commentators, crusades against floppers are often laced with a distrust of wily, olive-skinned outsiders. In March, the London Times initiated a campaign to "kick out the cheats." Playacting was said to have infiltrated English soccer from outside. "It's crept into our game lately, but it is a foreign thing," Alan Stubbs, an Everton defender, recently remarked. "They speak good English, it's not as if they don't understand what they're doing."
Whether or not you must know English to understand what you're doing, diving is hardly a recent conspiracy cooked up in southern climes. Reports of flopping go back to the early days of the sport, and—surprise!—Brits have been influential in its development. Manchester City striker Francis Lee, for example, was one of the first great divers of the television era. He won theatrical penalties in the 1960s and 1970s, long before the famed Argentine flopper Diego Simeone took his first fall. Fans who champion the "fair play" and the "work ethic" of traditional English soccer tend to overlook the dives of skilled English players like Michael Owen.
There is nothing more depressing than a player who goes to the ground when he might have scored. Ronaldinho and Thierry Henry, arguably the world's best players, will stay on their feet at all cost for the sake of a beautiful pass or a brilliant run at the goal. But the next time you see an artful dribbler derailed by a clumsy oaf, take a minute to think about whose side you're on. Doesn't the dribbler deserve a somersault or two to remind the world that the only way to stop him is through violent and graceless means?
In defense of soccer's biggest villains.
By Austin Kelley
Posted Tuesday, June 27, 2006, at 5:20 PM ET
In the final moments of Monday's World Cup match between Italy and Australia, Italian defender Fabio Grosso streaked up the left side toward the goal. He shrugged off one defender who tried to drag him down, then cut into the penalty area. Desperate to stop him, Australia's Lucas Neill slid into his path. Grosso made contact with Neill's prone body and sprawled onto the ground. The referee blew the whistle: penalty.
The Italians made the shot and won the game, 1-0. Fans of the Socceroos are now crying foul, saying Grosso fell on purpose to draw the penalty. After watching numerous replays, I agree that the Italian fell to the ground too easily. The referee still made the right call. This World Cup has inspired a mass of editorial bile about the evils of diving. But diving is not only an integral part of soccer, it's actually good for the sport.
Soccer players fall to the ground without being dragged down or tripped. Once they're on the ground, they often roll around and exaggerate their injuries. A lot of people hate these theatrics and even characterize them as immoral. Dave Eggers recently said in Slate that flopping is "a combination of acting, lying, begging, and cheating." But diving is far from outright cheating. Rarely do athletes tumble without being touched at all. Usually, they embellish contact to make sure the referee notices a foul, not to deceive him completely.
Obvious, undeserved flops should be punished with yellow cards. Ghana's Asamoah Gyan was justly ejected (on his second yellow) from Tuesday's match with Brazil when he fell without a Brazilian in sight. Such displays serve to discredit more honest, contact-driven divers like Grosso. Diving is like drawing a charge in basketball. When it is done well, it is a subtle (and precarious) art.
Consider the classic matchup between a skilled dribbler and a big, tough defender. The attacker must use his quickness and wit to get by. The bigger man, though, can always resort to a "professional foul"—an intentional foul in which there is no attempt to play the ball. The defender will give away a free kick, but that will hurt only in certain parts of the field. So, what is the attacker to do? If he finds a flailing leg in his way, he can do nothing except barge right into it. And maybe writhe around on the ground for a bit, encouraging the referee to hand out a card, thus discouraging the brutish defender from trying such rough tactics in the future.
Far from being a sign of corruption, diving is, in certain ways, a civilizing influence. Divers are usually quicker, smaller players. As athletes get bigger and stronger, the little guy gets nudged aside. If professional fouls and brute force reign supreme, creative play and joyful improvisation will suffer.
FIFA doesn't see it this way. Prior to this year's World Cup, the organization issued special instructions to referees to crack down on "simulation." This misguided initiative has failed miserably. We've had more flopping than ever despite a record number of yellow cards and ejections.
Referees are partly responsible for the culture of diving. Officials are much more likely to blow their whistles when they see a few somersaults, and players know they might not get a call if they stay on their feet. But in soccer there is only one main referee patrolling a pitch that measures up to 120 by 80 yards. He cannot see everything, and diving is particularly hard to discern. Even with the benefit of slow-motion replays, it's sometimes difficult to tell a flop from a "natural" fall.
The scorn heaped on divers usually doesn't have to do with the logistics of refereeing, though. In reality, it's distaste for the spectacle. American sports are loaded with comic set pieces—a hockey player tossing his gloves for a ceremonial tussle or a baseball manager kicking dirt at the umpire. Like tumbling soccer players, these performers act to provoke sympathy or indignation. The difference is in the style of emotional drama.
In most American sports, the theatrics are aggressive. They are not operatic displays of vulnerability. To appreciate diving, we must sympathize with or scorn the injured player—we must get into the melodrama. Some fans are afraid to take the plunge, preferring to argue that diving makes soccer players seem like babies or, worse still, women. (Former England striker Gary Lineker has called for a special "pink card" to be shown to divers.) Their distaste for the dive is rooted in an idea of masculinity, not in an analysis of the game itself. That idea of masculinity is preventing them from enjoying a pretty good show.
The other most pervasive critique of diving is a nationalist one. Depending on who you talk to, Sunday's flop-heavy, four-red-card debacle between the Netherlands and Portugal was the fault of either Iberian gamesmanship or Dutch fakery. For Anglo-American commentators, crusades against floppers are often laced with a distrust of wily, olive-skinned outsiders. In March, the London Times initiated a campaign to "kick out the cheats." Playacting was said to have infiltrated English soccer from outside. "It's crept into our game lately, but it is a foreign thing," Alan Stubbs, an Everton defender, recently remarked. "They speak good English, it's not as if they don't understand what they're doing."
Whether or not you must know English to understand what you're doing, diving is hardly a recent conspiracy cooked up in southern climes. Reports of flopping go back to the early days of the sport, and—surprise!—Brits have been influential in its development. Manchester City striker Francis Lee, for example, was one of the first great divers of the television era. He won theatrical penalties in the 1960s and 1970s, long before the famed Argentine flopper Diego Simeone took his first fall. Fans who champion the "fair play" and the "work ethic" of traditional English soccer tend to overlook the dives of skilled English players like Michael Owen.
There is nothing more depressing than a player who goes to the ground when he might have scored. Ronaldinho and Thierry Henry, arguably the world's best players, will stay on their feet at all costs for the sake of a beautiful pass or a brilliant run at the goal. But the next time you see an artful dribbler derailed by a clumsy oaf, take a minute to think about whose side you're on. Doesn't the dribbler deserve a somersault or two to remind the world that the only way to stop him is through violent and graceless means?
The myth of the hands-off conservative jurist.
Fair to Meddling
The myth of the hands-off conservative jurist.
By Seth Rosenthal
Posted Tuesday, June 27, 2006, at 2:28 PM ET
As the Supreme Court's term nears its conclusion, columnist George Will has asserted that the John Roberts and Samuel Alito confirmation debates were all about preventing "the nation's courts [from being pulled] even more deeply than they already are into supervising American life." The implication is that those who championed the recent nominations believe in a limited role for the courts, while those with reservations idealize an expansive one.
This characterization, taken directly from the right wing's playbook, sounds nice and neat. There's only one problem: It isn't true.
To be sure, conservatives have long vituperated against certain decisions that entangle courts in societal governance—from now-canonized Warren Court decisions desegregating public schools, ensuring "one person-one vote," and guaranteeing the rights of criminal defendants, to still-controversial decisions enforcing church-state separation, preserving abortion rights, and invalidating laws that single out gays for punitive treatment. And to be sure, liberals (and many moderates) generally celebrate those same rulings.
But contrary to what Will suggests, that's just not the end of the story. Though you'll rarely hear them admit it, today's movement conservatives do embrace muscular courts that "supervise American life," often in the very same cases in which liberals want courts to take a hands-off approach. The most fervent Roberts and Alito supporters would use the power of judicial review to wipe out or weaken land-use regulations, campaign-finance reform, affirmative action, and gun control. Perhaps more significant, they cheered—and hope that the additions of Alito and Roberts re-invigorate—the Rehnquist Court's recently slowed assault on Congress' legislative authority. Generating the highest-ever annual rate of invalidating federal legislation, the "hands-off" Rehnquist Court scotched laws safeguarding workers, seniors, people with disabilities, school children, and religious minorities and established standards threatening to scuttle even more, including important environmental achievements.
Among other things, these rulings provide that when deciding cases in which state government officials stand accused of violating a federal antidiscrimination law (such as the Americans with Disabilities Act), courts must strike the law down unless they determine that it is a "congruent and proportional" response to a demonstrable history of state-sponsored discrimination. Another landmark conservative favorite brushed aside a "mountain of data"—four years of fact-finding, studies from task forces in 21 states, and eight different congressional reports—to condemn as unconstitutional the "method of reasoning" Congress employed to enact legislation that would have protected women against violence. Talk about supervising American life.
And that's just what your run-of-the-mill movement conservative supports: Those on the right wing's fringes look to use the Constitution to undo scores of state and federal laws protecting workers, consumers, and public health. Just like the Supreme Court did in the early 20th century, when it infamously wiped out all manner of social welfare legislation, including laws prohibiting child labor, setting a minimum wage, and regulating maximum work hours. New books by pundits Mark Smith and Andrew Napolitano advocate this cause with swagger.
The truth, then, is that despite all their fulminating about judicial activism, conservatives today firmly believe that courts must step in to oversee, correct, or invalidate the actions of government officials. They simply disagree with liberals on when to do it.
And that is the real debate—the debate over "when"—we should be having. Not the tiresome mudslinging about who's an "activist" and who's "restrained," but rather an open, honest discussion about when the exercise of judicial power is justified and when it isn't.
For conservatives, as the above examples show, judicial intervention is fine when it means slicing up labor, consumer, civil rights, and environmental regulations intended to curb the potential excesses of laissez-faire economic policy. It isn't fine when it means enforcing civil rights and civil liberties. Liberals take pretty much the opposite view. This is, of course, a crude generalization, but the exceptions—conservatives (like liberals) want courts to protect the freedom of religious minorities, liberals (like conservatives) want courts to scrutinize over-aggressive eminent domain practices—don't put much of a dent in it.
Two Supreme Court cases decided in the past weeks illustrate the point. In the case that prompted Will's column, the high court ruled, by a vote of 5-4, that the First Amendment offers no protection against retaliation for public employees who, in the course of their duties, blow the whistle on government waste, fraud, or corruption. Roberts and Alito joined the majority. Conservatives like Will applauded the court's narrowing of judicially enforceable liberties because, they say, it frees the hands of government employers to maintain a disciplined workforce. Liberals, on the other hand, criticized the take-a-powder diktat to lower courts, saying it discourages whistle-blowing and curbs the free-speech rights of the nation's 21 million government workers.
Last week the tables turned in an important case that determined whether the Clean Water Act applies to environmentally sensitive wetlands on private property. According to a brief filed by former members of Congress who enacted the law, it does, and federal regulators have treated it as such for 30 years. The states liked it that way, too. Liberals wanted the court to back off and allow this pro-environmental-protection consensus to hold. Many conservatives, however, wanted the court to meddle, arguing that the Clean Water Act doesn't really cover many of the wetlands that the legislative-regulatory consensus said it did. Much to their delight, Roberts and Alito joined Justices Scalia and Thomas in adopting this intrusive approach.
And there, in a nutshell, is what simultaneously animated liberal concern and conservative giddiness over Roberts and Alito last fall and winter: It was not simply that these two new justices might pull the federal judiciary away from enforcing legal rights, as in the whistle-blower case. It was also—perhaps even more—that, as in the wetlands case, they would forcefully insert the judiciary into areas of governance where, for much of the past 75 years, the courts have trod very carefully.
The pretense that conservatives favor shrunken courts while liberals favor aggrandized ones obviously plays to a sizable chunk of the Republican base, which in fact wants courts to exit the arena on the issues it cares about most deeply—abortion, gays, and religion in the public square. But the pretense also provides a useful cover for many legal views that just aren't that popular. After all, as compared to the liberal vision of judicial power, the conservative vision typically advantages corporations, developers, state governmental institutions, and monied political interests. Not exactly a winning message.
When they get past the rhetoric, conservatives contend that these outcomes, which usually privilege the rich and powerful, rest on solid legal foundations. Fair enough. The public would benefit from a debate about the merits of such arguments. What the public doesn't benefit from is the charade that these outcomes are compelled by some unwavering commitment to a "limited" role for the courts.
Seth Rosenthal, a former federal prosecutor, is legal director of Alliance for Justice.
Tuesday, June 27
Remembering Richard Hofstadter
By CARLIN ROMANO
Years ago in Albuquerque, at a conference rich in themes of American Indian philosophy and the Southwest's Spanish legacy, a local journalist tossed a thought at me that I found epiphanic in its elegant yet caustic common sense.
"The difference between the Eastern establishment and us is really simple and geographic," he said, more or less. "You think American history moves from right to left. We think it moves from left to right, except all those folks on the right started heading in our direction. You read American history like Hebrew. We read it like, if you will, Spanish!"
I'd spent two decades appreciating nuances of socially constructed European philosophy. Hypercritical literary attitudes toward theory came naturally. But I realized at that hotel cafe, with internal shamefacedness, that I'd now caught how even the most basic engine of American history lies in the eye of the Americanist, professional or not.
My opaqueness presumably began in experiencing grade-school civics before the American Revolution became a subfield of "Insurrections in Atlantic Civilization," before American exceptionalism yielded to "We Ain't Nothin' Special"-ism, a time when Crispus Attucks and Phillis Wheatley satisfied the faint impulse toward tokenism. For intellectuals as for everyone else, the hardest feat is to break free of the standard history of one's country and religion. We absorb it at an age when critical skills remain weak, when our vulnerability to "natural" truths is at its peak. As Richard Rorty once remarked, most of us have cartoon versions of history and philosophy in our heads, though their hold on us differs.
Eventually, most wised-up readers of history come to agree with the advice of E.H. Carr, cited and honored by David S. Brown, that "Before you study the history, study the historian." The payoff of Brown's effort comes in Richard Hofstadter: An Intellectual Biography (University of Chicago, 2006), an incisive interpretive profile.
In choosing Hofstadter (1916-70) to explore Carr's rubric, Brown, associate professor of history at Elizabethtown College, fixes on a man who occupied a position continually rewarded by America's intellectual establishment, but not often scrutinized: King of American History. It comes with a chair at a prestigious and preferably Ivy institution, and an open invitation to write for the most prestigious opinion magazines and book reviews. Think Gordon Wood and (still) Arthur Schlesinger Jr., with Sean Wilentz one middle-aged prince. As Brown puts it of Hofstadter, "For nearly 30 years ... he wrote the best books for the best publisher, won the best prizes, and taught in the best city, at the best school, at the best time."
Translation: The hugely influential Social Darwinism in American Thought, 1860-1915 (1944), published when the author was 28, was followed by his extremely significant The American Political Tradition (1948). Pulitzers greeted The Age of Reform in 1956 and Anti-Intellectualism in American Life in 1964. He held a Columbia professorship for 24 years in the dramatic era from World War II's aftermath through McCarthyism and '60s turmoil.
Brown establishes Hofstadter's sparkling achievements, nearly a dozen books in a quarter century of active scholarship. He rightly attributes his subject's fresh slant in a once largely WASP field to growing up the son of a Polish Jewish father and German Lutheran mother and spending his modest early days as a University of Buffalo undergraduate. Although Hofstadter experienced personal tragedy — his first wife, journalist Felice Swados, died of cancer in 1945 when son Dan was a toddler — he kept to his goal of publishing himself out of a University of Maryland assistant professorship that felt like exile after graduate school at Columbia. In 1945, Henry Steele Commager explained, Columbia began a search for "someone who can really take hold of intellectual history and develop this place as a center for the study of American civilization." It tapped Hofstadter over Arthur Schlesinger Jr. The winner began in 1946 and the rest was, in the best sense, revisionist history.
Hofstadter made clear to readers of American history that the mid-20th-century discipline was up for grabs. The miracle of Protestant liberalism announced by George Bancroft and Francis Parkman, the "Jeffersonian liberal" vs. "Hamiltonian conservative" grudge match offered by Vernon Louis Parrington, and Charles Beard's trust in economic causes all stood ripe for rethinking. Hofstadter, writes Brown, proved "a thoughtful agent of change in a nation rapidly moving away from its Protestant moorings" as he became "a leading interpreter of American liberalism." Where Frederick Jackson Turner had famously proffered American democracy as the upshot of frontier individualism, Hofstadter insisted on giving urban America its due.
Hofstadter, Brown asserts, "enlisted the past to reveal the failings of a time-worn political tradition and by inference highlight the promise of what he believed was a more humane, cosmopolitan, and pluralistic postwar liberalism. Anglo-Saxonism and agrarianism were out. Ethnic diversity and modernity were in. As the old codes gave way, America's need for fresh heroes and new perspectives encouraged Hofstadter to rewrite its history as a prelude to moving its culture." For Hofstadter, Brown summarizes, the WASP worldview was "isolationist, individualistic, nationalistic, and capitalistic," fated to break down "before a sharp cultural realignment shaped by demographic change."
In Social Darwinism, Hofstadter "argued that deeply internalized beliefs moved people, for ultimately whoever controlled the prevailing value system — defining God, morality, politics, and patriotism — won the right to apportion rewards." Hofstadter tackled big ideas because he believed big ideas moved American history. As he wrote critically of his title phenomenon in Social Darwinism, it "offered a comprehensive worldview, uniting under one generalization everything in nature, from protozoa to politics." Rarely able to suspend his critical antennae enough to man actual barricades with true-believing radicals, Hofstadter nonetheless brought a Gramscian critique of hegemony to American history.
Big revisionist ideas, and deft use of coinages such as "status anxiety" and "paranoid style," became Hofstadter's signature. Increasingly, Brown says, the proto-public intellectual "expected his books to ripple through the culture." When The American Political Tradition appeared, Hofstadter's distinguished trade publisher trumpeted it for boldly arguing that "all great parties, even the Populists" were "loyal to the twin principles of property and progress." Hofstadter's stinging revisionist view of Jefferson's agrarian vision, faulting it as obsolete for a nation turning, in Brown's words, "urban, industrial, and ethnic," similarly amounted to a sharp attack on received truth.
Not every conceptual airship flew smoothly for Hofstadter. Although his "career-long defense of intellect" led to Anti-Intellectualism in American Life — the title became iconic — some peers felt that in it the author crossed the line from historian to polemicist, to elitist tribal leader defending his flock.
"Hofstadter's emotional involvement in the contest between intellect and egalitarianism," writes Brown, "transformed Anti-Intellectualism into a personal statement" and his "least satisfying work." The book, Brown argues, simply "attacked conventional Midwestern WASP values," from evangelical Protestantism to egalitarian education. Hofstadter always criticized what his biographer calls "the cult of proletarianism." Brown adds that the somewhat bourgeois professor "loathed demands upon the learned class to bow before the moral superiority of the working class." With too much of that attitude apparent in Anti-Intellectualism, even Hofstadter came to feel he had, in Brown's words, "missed the mark."
Hofstadter died of leukemia. Were he alive today, he'd be 90, the same age as John Hope Franklin, with whom he marched in Montgomery in 1965. We might think of Hofstadter as the John Hope Franklin of urban intellectuals and liberals. Franklin bridles when benighted newspaper types describe him as the magisterial scholar of black American history. He counters that the category is American history, in which blacks played a rather big part. Hofstadter, more a wizardly writer than talented archival digger, did similar yeoman's work in creating new narratives with room for America's ethnic populations, workers, and thinkers. His books show that America's history not only can but must be rewritten by each generation because the nation keeps changing. Who we are today permits us to devalue some facts, elevate others, and even shift the plot line.
A Hofstadter student tells Brown, "Discipleship was a thing he never asked for." Brown accordingly concludes that "there is no Hofstadter school." True in the narrowest academic sense. But Hofstadter's spirit persists in every contemporary American historian who sees the subject afresh. One thinks of the rise of scholarship about women of the Revolutionary era, sparked in part by his former student, Linda Kerber, now a leader of the profession. That Hofstadter, dead at 54, retains the authority of a nonagenarian master confirms that it's not just grade-school versions of history that pack staying power.
Carlin Romano, critic at large for The Chronicle and literary critic for The Philadelphia Inquirer, teaches philosophy and media theory at the University of Pennsylvania.
http://chronicle.com
Section: The Chronicle Review
Volume 52, Issue 43, Page B12
Copyright © 2006 by The Chronicle of Higher Education
Two richest Americans vow to fight "lucky sperm club"
A $31 Billion Gift Between Friends
By LANDON THOMAS Jr.
The friendship between Warren E. Buffett and Bill Gates has been forged over a shared passion for such homespun American treats as cherry Coke, burgers and college football. They delight as well in loftier pursuits, like playing bridge and solving complex math problems.
But, more than anything, what Mr. Buffett's $31 billion gift to the foundation that Mr. Gates runs with his wife, Melinda, shows is a common disdain for inherited wealth and a shared view that the capitalist system that has enriched them so handsomely is not capable alone of addressing the root causes of poverty.
"A market system has not worked in terms of poor people," Mr. Buffett said yesterday, in an interview taped earlier in the day for "The Charlie Rose Show" on PBS.
As for any thought he might have had in giving the bulk of his billions to his three children, Mr. Buffett was characteristically blunt. "I don't believe in dynastic wealth," he said, calling those who grow up in wealthy circumstances "members of the lucky sperm club."
Monday, June 26
Social Isolation Growing in U.S., Study Says
The Number of People Who Say They Have No One to Confide In Has Risen
By Shankar Vedantam, Washington Post Staff Writer
Friday, June 23, 2006; A03
Americans are far more socially isolated today than they were two decades ago, and a sharply growing number of people say they have no one in whom they can confide, according to a comprehensive new evaluation of the decline of social ties in the United States.
A quarter of Americans say they have no one with whom they can discuss personal troubles, more than double the number who were similarly isolated in 1985. Overall, the number of people Americans have in their closest circle of confidants has dropped from around three to about two.
The comprehensive new study paints a sobering picture of an increasingly fragmented America, where intimate social ties -- once seen as an integral part of daily life and associated with a host of psychological and civic benefits -- are shrinking or nonexistent. In bad times, far more people appear to suffer alone.
"That image of people on roofs after Katrina resonates with me, because those people did not know someone with a car," said Lynn Smith-Lovin, a Duke University sociologist who helped conduct the study. "There really is less of a safety net of close friends and confidants."
If close social relationships support people in the same way that beams hold up buildings, more and more Americans appear to be dependent on a single beam.
Compared with 1985, nearly 50 percent more people in 2004 reported that their spouse is the only person they can confide in. But if people face trouble in that relationship, or if a spouse falls sick, that means these people have no one to turn to for help, Smith-Lovin said.
"We know these close ties are what people depend on in bad times," she said. "We're not saying people are completely isolated. They may have 600 friends on Facebook.com [a popular networking Web site] and e-mail 25 people a day, but they are not discussing matters that are personally important."
The new research is based on a high-quality random survey of nearly 1,500 Americans. Telephone surveys miss people who are not home, but the General Social Survey, funded by the National Science Foundation, has a high response rate and conducts detailed face-to-face interviews, in which respondents are pressed to confirm they mean what they say.
Whereas nearly three-quarters of people in 1985 reported they had a friend in whom they could confide, only half in 2004 said they could count on such support. The number of people who said they counted a neighbor as a confidant dropped by more than half, from about 19 percent to about 8 percent.
The results, being published today in the American Sociological Review, took researchers by surprise because they had not expected to see such a steep decline in close social ties.
Smith-Lovin said increased professional responsibilities, including working two or more jobs to make ends meet, and long commutes leave many people too exhausted to seek social -- as well as family -- connections: "Maybe sitting around watching 'Desperate Housewives' . . . is what counts for family interaction."
Robert D. Putnam, a professor of public policy at Harvard and the author of "Bowling Alone," a book about increasing social isolation in the United States, said the new study supports what he has been saying for years to skeptical audiences in the academy.
"For most of the 20th century, Americans were becoming more connected with family and friends, and there was more giving of blood and money, and all of those trend lines turn sharply in the middle '60s and have gone in the other direction ever since," he said.
Americans go on 60 percent fewer picnics today and families eat dinner together 40 percent less often compared with 1965, he said. They are less likely to meet at clubs or go bowling in groups. Putnam has estimated that every 10-minute increase in commutes makes it 10 percent less likely that people will establish and maintain close social ties.
Television is a big part of the problem, he contends. Whereas 5 percent of U.S. households in 1950 owned television sets, 95 percent did a decade later.
But University of Toronto sociologist Barry Wellman questioned whether the study's focus on intimate ties means that social ties in general are fraying. He said people's overall ties are actually growing, compared with previous decades, thanks in part to the Internet. Wellman has calculated that the average person today has about 250 ties with friends and relatives.
Wellman praised the quality of the new study and said its results are surprising, but he said it does not address how core ties change in the context of other relationships.
"I don't see this as the end of the world but part of a larger puzzle," he said. "My guess is people only have so much energy, and right now they are switching around a number of networks. . . . We are getting a division of labor in relationships. Some people give emotional aid, some people give financial aid."
Putnam and Smith-Lovin said Americans may be well advised to consciously build more relationships. But they also said social institutions and social-policy makers need to pay more attention.
"The current structure of workplace regulations assumes everyone works from 9 to 5, five days a week," Putnam said. "If we gave people much more flexibility in their work life, they would use that time to spend more time with their aging mom or best friend."
Read the Study:
http://www.asanet.org/galleries/default-file/June06ASRFeature.pdf
Sunday, June 25
After 5 of the 16 Faculty at Patrick Henry Depart, the Question: Are Christianity and Liberal Arts Contradictory Missions?
By THOMAS BARTLETT
The Chronicle of Higher Education
For such a small college, Patrick Henry sure attracts big-time attention. It has been featured in The New York Times, USA Today, and The Economist. The New Yorker also weighed in with a lengthy piece. Just last month the Discovery Times Channel broadcast an in-depth documentary on Patrick Henry titled God's Next Army.
Not bad for a college that opened its doors in 2000 and has around 300 students. The news media see Patrick Henry as the Christian college with a right-wing political agenda -- and for good reason. Its stated mission is to prepare undergraduates, most of whom were home-schooled, to "lead our nation and shape our culture with timeless biblical values."
Patrick Henry students have interned for top Republican honchos like Bill Frist, Tom DeLay, and Karl Rove. Janet Ashcroft, wife of the former attorney general, sits on the Board of Trustees. The college makes no secret of its ideological leanings; rather, it flaunts them.
It also boasts of its classical liberal-arts curriculum. The vision was that Patrick Henry would become as rigorous and prestigious as an Ivy League institution, but with a firm religious grounding -- a faith-based Harvard.
But a core group of Patrick Henry professors now question that much-publicized commitment to the liberal arts. Five of its 16 full-time professors, two of whom have been there from the beginning, are leaving after a bitter battle over academic freedom. Their departures have shaken the campus and created doubts about the college's future.
They have also raised questions that cut to the heart of Christian higher education, such as: Can a Christian find truth in the writings of non-Christians? What role should the Bible play in the classroom? Should a Christian college student grapple with the same philosophers and the same issues as any other student? Or are certain ideas too worldly to address?
The controversy has pitted the college's president and founder, Michael P. Farris, against many of its professors. He has challenged their fidelity to a biblical worldview, and they have challenged his commitment to the liberal arts. "When he accuses us of not buying into the vision of the college, we have to scratch our heads," said M. Todd Bates, an assistant professor of rhetoric, who is leaving after this semester. "We came here because of the vision. The question is: What has happened to that vision?"
Saint in Hell?
It began with Augustine. There were already simmering tensions between some professors and the president -- over both academic freedom and the way the college disciplines students -- but the bishop of Hippo became the tipping point.
Augustine's pursuit of truth was the topic of a campuswide lecture delivered last fall by Mr. Bates. The lecture was part of a new tradition at Patrick Henry. The idea was that each fall a professor would deliver a lecture and then, in the spring, a guest speaker would talk on the same topic. Classes were canceled that day so all students could attend. It was designed to bring the campus together.
It led, instead, to a deep division. When Mr. Farris, the president, saw the text of Mr. Bates's lecture the day before it was given, he was not pleased.
And when Mr. Farris is not pleased, he is not afraid to let people know.
Before starting Patrick Henry, Mr. Farris was president of the Home School Legal Defense Association, which he founded in 1983. He is 54 years old, though he could pass for 44 ("I have young hair," he says). He is a constitutional lawyer and an ordained minister.
Patrick Henry is an hour outside Washington, in the town of Purcellville, Va., population 3,500. Its small campus feels empty: The main administration building faces a field where a 106,000-square-foot student center is planned (a groundbreaking ceremony was held last week). Mr. Farris's second-floor office overlooks another field. There is, in other words, plenty of room for growth. The president has grand plans, along with high-profile backers like Tim LaHaye, co-author of Left Behind, the best-selling Christian book series, who can help make those plans happen.
There are still hurdles to be cleared -- accreditation among them. The Southern Association of Colleges and Schools rejected Patrick Henry's accreditation bid. The American Academy for Liberal Education first turned down the application. The college appealed, and later withdrew it. The college is now a candidate for accreditation with the Transnational Association of Christian Colleges and Schools.
The president is understandably proud of the college he started and enjoys talking about, for example, the larger library that's in the works, or the cafeteria overhaul. He seems to revel in the details. He also seems to know most of the students by name, greeting them as they pass in the hallway.
When he's not playing campus guide, Mr. Farris can be assertive and direct. In these moments, his courtroom experience becomes evident. Mr. Bates's lecture, according to Mr. Farris, did not sufficiently reflect the college's Christian mission. "The original version was 24 pages long, and it didn't mention the Bible," he says.
Mr. Farris says he simply asked Mr. Bates: "Can't we say something about the Bible?"
According to the professor and Paul Bonicelli, who was dean of academic affairs at the time, the president first threatened to cancel the lecture. He relented after certain changes and additions were made.
But a draft of the speech that Mr. Bates says is the original mentions "Scripture" in the second paragraph. Mr. Bates wrote how Christians can draw on "divine revelation" and encouraged students to become the kind of people who "may shape the culture to the glory of God." The professor sent the lecture to faculty members and administrators before he delivered it. Mr. Farris, he says, was the only person who objected.
Mr. Bates says he and another faculty member discussed the speech with the president for nearly two hours. During that discussion, according to Mr. Bates, Mr. Farris said he believed that Augustine was in hell. (Another faculty member says Mr. Farris told him the same thing.) Mr. Farris denies saying this, though he admits having reservations about the famous saint.
Many professors saw the president's interference as heavy-handed and worrisome. If a professor could be scolded and forced to alter a lecture on a well-known Christian figure, what would happen if a lecture were given on Machiavelli or Marx?
After the incident, nine professors, including Mr. Bates, reached an informal understanding. If one of them were reprimanded or dismissed unfairly, the others would come to his or her aid. There were no legally binding oaths, and nothing was written down -- "it's not like we signed anything in blood," as one professor put it -- but it was an agreement just the same.
The Lifeboat Example
Their pact was soon put to the test.
This semester, the father of a student wrote a letter to the president complaining about one of his daughter's professors. The professor, Erik S. Root, who teaches government and was among the nine, used what is sometimes called the "lifeboat example" in one of his classes. Mr. Root asked students to imagine that two people are clinging to an inner tube in the middle of the ocean. The inner tube can support only one of them, so someone has to let go. The example was part of a discussion on the state of nature. "What would Hobbes and Locke have to say about this?" Mr. Root asked the class.
One student answered by quoting John 15:13: "Greater love hath no man than this, that a man lay down his life for his friends."
"That's great, but it's too simplistic," Mr. Root says he told her. "Can we flesh that out?"
The discussion continued, and the professor brought up other moral quandaries, like whether a soldier should fall on a grenade to save his comrades. The lifeboat example was mentioned in passing; it was not the central theme of the lecture.
But the father of the student who quoted the Bible happened to be sitting in on the class that day. He was not happy with the professor's response -- and he wrote the president a letter telling him so.
Mr. Farris was bothered by the allegations made in the letter. "That's troubling if he told her that was simplistic," he said. "If he did what was alleged, then he should not be back here next year."
For the record, Mr. Root is a Christian who believes that the Bible is the inspired, infallible word of God. He says he wasn't denigrating it or the student with his response. In this context, he argues, quoting the verse does not fully answer the question. He wanted to keep the discussion going -- and that is what he did.
Mr. Farris, through a dean, demanded an explanation. The professor insisted that the president's concerns be put in writing. Mr. Farris responded with a list of seven questions, only one of which dealt with the lifeboat example. The others were related to an article Mr. Root had written on -- who else? -- St. Augustine. But it was the lifeboat example that aroused the president's ire.
If Mr. Farris had simply asked for an explanation, according to Mr. Root, there would have been no problem. He would have gladly offered one. But Mr. Farris also informed the professor that he was withholding his contract for the following year until he received satisfactory answers to his questions (there is no tenure at Patrick Henry). In other words, his job was in danger.
'Stand on Principle'
When Mr. Bates heard about Mr. Root, he felt like he had been kicked in the chest. "We knew it could happen to any of us," he said. "So when it did happen, it confirmed that uneasiness."
Four of the professors who agreed to the pact, including Mr. Bates, followed through. It was not a decision any of them took lightly. For starters, they all believed in the Patrick Henry vision and wanted to be part of making it a reality. There were also practical considerations. "When you have a family and a mortgage, it's a serious decision," said David C. Noe, an assistant professor of classics and one of the four. "I have to put food on the table." Still, the professors concluded that a promise was a promise. If Mr. Root's contract was being withheld, then they would decline to sign theirs.
Mr. Noe calls it a "question of honor." Mr. Bates sees it in similar terms: "I had to come home and be able to look my kids in the eyes," he said, "and let them know that they have a dad they can respect, who will stand on principle."
The five men are not just colleagues, but good friends. All except Mr. Root have young children, and their families know one another and regularly get together. Even when discussing the controversy, which has been painful for all of them, there is laughter and good-natured ribbing. They were close before, and recent events seem to have brought them closer.
The president, for his part, says he is "baffled" by the professors' decisions.
But what happened to Mr. Root seemed, to them, like simply the latest outrage, further proof that the president had little respect for them or genuine academic inquiry.
Around that time, Mr. Noe and J. Kevin Culberson, an assistant professor of history and literature who is also among the four, wrote an article for the college's student magazine. The article was a "line in the sand," according to Mr. Noe. In it he and Mr. Culberson argued against the idea that "the Bible is the only source of truth." There is, they wrote, "much wisdom to be gained from Parmenides and Plato, as well as Machiavelli and Marx."
The article says that while the Bible contains "everything we need to know for reconciliation with God," it does not tell us "how to fix a door jamb or file a brief in appellate court." In other words, there is value in books other than the Bible. What's more, they argued, Christians have an obligation to seek out truth wherever it may be found.
This 900-word article prompted a 2,600-word response from the college's chaplain that was officially endorsed by the president. The lengthy rebuttal, which was sent via e-mail to professors and students, questions the value of reading Machiavelli. It also strongly implies that the professors' column violated the college's statement of biblical worldview.
When asked recently what he objects to, Mr. Farris scanned the article on his computer screen quietly for a few moments. He then wondered aloud whether it had been altered to remove the offending phrases. "I think it may have been edited," he said. "That would be an interesting story." After reviewing another copy of the article, this one provided by his secretary, he conceded that the online version had not been edited.
He pointed to a line that says, "Clearly there is no greater good than knowledge...."
That, he said, is contrary to the Bible.
Yet the rest of the sentence clarifies their intent: "... for without knowledge, there can be no use of any other gift which God imparts." The professors say the knowledge they wrote about includes knowledge of God. And what, they wonder, is so unbiblical about that?
The article also says that "the best place to find evidence of Providential handiwork is not in mountains, birdsong, and sunsets, but in the works of men."
Mr. Farris does not agree. "I don't think the musings of Dennis Rodman are superior to the beauty of Mount Rainier," he said. The professors, however, mentioned some of history's greatest thinkers, not a former professional basketball player best known for tattoos and dating Madonna.
Withholding His Blessing
The day after the article was published, the four professors and Mr. Root told the president they would be leaving the college. They also asked the president's permission to inform students of the reasons for their departures. Rumors flew around the tightly knit community. Some found expression in responses to a short item about the controversy that was posted on The Chronicle's News Blog.
Some students wondered if the professors now rejected the college's statements of faith and biblical worldview.
The professors' faith in Christianity was questioned, too. In an interview with the student newspaper, the president seemed to doubt the professors' belief in the Bible. He later sent an e-mail message to the campus clarifying his comments by saying he believes "they have a sincere desire to honor the Bible as God's authoritative Word."
The professors wanted to clear the air. They say they all believe in the Bible and remain committed Christians. But they were worried that explaining to students why they were leaving might be cause for dismissal. All five men had several months left on their contracts and needed time to find other employment.
No answer was forthcoming. When asked why he declined to give that permission, Mr. Farris offered an analogy: "I have adult children," he said. "They can come to me for advice or blessing. If someone wants to ask for my blessing, that means I agree with them. If they ask for advice, I can tell them and they'll go do whatever they want."
In this analogy, the professors were seeking his blessing, which he was not willing to give.
Despite that lack of assurance, Robert Stacey, chairman of the government department and one of the four professors, read aloud the college's statements of faith and biblical worldview in one of his classes. He told students that if they thought he had been unfaithful to these statements, they should approach him personally. Or, if they felt he was beyond redemption, they should leave the classroom rather than sit under his instruction. He said he would not hold it against anyone who left. About 15 minutes later, one student left. Everyone else stayed.
When the president heard what Mr. Stacey had said, he fired him.
Mr. Farris explained that Mr. Stacey was "forcing students to leave the classroom if they disagreed with him." He mentioned the student who left the class. "What's that girl supposed to do?" he asked.
'Foolishness With God'
Several students interviewed by The Chronicle, like Shant Boyajian, a junior government and public-policy major, believe the professors were treated unfairly. "I think the professors made a courageous stand in the face of an injustice that was done against them," he said.
Not all students feel that way. Jeremiah Lorrig, a senior government major, believes that it is better for the professors to leave because they weren't doing what the president asked of them. "If I work at McDonald's, I should want to sell hamburgers," he said. "If I want to sell tacos, I shouldn't work there."
Another student, who asked to remain anonymous because he fears retaliation, summed it up this way: "The professors believe there is truth to be obtained from Plato and Aristotle. While Dr. Farris would in theory advocate reading Nietzsche, he doesn't actually see the liberal arts as a good way to find truth."
Mr. Farris said he does value works other than the Bible. He teaches a course on the Constitution and pointed out that he rarely quotes the Bible in that class. "I'm the one who started the college, and I'm the one who articulated the vision," he said. "What I don't think is that we take the Greek philosophers and swallow it whole. I believe what the Scripture said, and that is that the wisdom of this world is foolishness with God."
After this semester, Mr. Farris will become chancellor, and Graham Walker, who was formerly vice president for academic affairs at Oklahoma Wesleyan University, will become Patrick Henry's second president. Mr. Farris wonders why, if he is stepping aside, the professors still decided to leave. "If I'm the problem -- well, I'm going to be gone," he said.
The professors say they doubt that Mr. Farris would ever relinquish control of the college, regardless of his title. "Besides," said Mr. Noe, "it wouldn't right past wrongs."
Mr. Bonicelli, the former academic-affairs dean, was there when the college began and helped write the statements of faith and biblical worldview. He believes that the president is not truly committed to the liberal arts. Mr. Bonicelli, who left what he calls his "dream job" because of disagreements with how Mr. Farris ran the college, called the firing of Mr. Stacey "absurd" and said that comments Mr. Farris has made about the value of reading Augustine and Plato were "horrific."
He also believes that the college -- which has gotten so much attention during its brief existence -- will never be the same after the professors' departures. "I think the trajectory now is that it will become a Bible college of sorts, not an excellent Christian liberal-arts college," he said. "The college as we created it -- I don't think that can be saved."
Administration's outsourcing brings low-quality services at high cost
The Road From K Street to Yusufiya
By FRANK RICH
AS the remains of two slaughtered American soldiers, Pfc. Thomas L. Tucker and Pfc. Kristian Menchaca, were discovered near Yusufiya, Iraq, on Tuesday, a former White House official named David Safavian was convicted in Washington on four charges of lying and obstruction of justice. The three men had something in common: all had enlisted in government service in a time of war. The similarities end there. The difference between Mr. Safavian's kind of public service and that of the soldiers says everything about the disconnect between the government that has sabotaged this war and the brave men and women who have volunteered in good faith to fight it.
Privates Tucker and Menchaca made the ultimate sacrifice. Their bodies were so mutilated that they could be identified only by DNA. Mr. Safavian, by contrast, can be readily identified by smell. His idea of wartime sacrifice overseas was to chew over government business with the Jack Abramoff gang while on a golfing junket in Scotland. But what's most indicative of Mr. Safavian's public service is not his felonies in the Abramoff-Tom DeLay axis of scandal, but his legal activities before his arrest. In his DNA you get a snapshot of the governmental philosophy that has guided the war effort both in Iraq and at home (that would be the Department of Homeland Security) and doomed it to failure.
Mr. Safavian, a former lobbyist, had a hand in federal spending, first as chief of staff of the General Services Administration and then as the White House's chief procurement officer, overseeing a kitty of some $300 billion (plus $62 billion designated for Katrina relief). He arrived to help enforce a Bush management initiative called "competitive sourcing." Simply put, this was a plan to outsource as much of government as possible by forcing federal agencies to compete with private contractors and their K Street lobbyists for huge and lucrative assignments. The initiative's objective, as the C.E.O. administration officially put it, was to deliver "high-quality services to our citizens at the lowest cost."
The result was low-quality services at high cost: the creation of a shadow government of private companies rife with both incompetence and corruption. Last week Representative Henry Waxman, the California Democrat who commissioned the first comprehensive study of Bush administration contracting, revealed that the federal procurement spending supervised for a time by Mr. Safavian had increased by $175 billion between 2000 and 2005. (Halliburton contracts alone, unsurprisingly, went up more than 600 percent.) Nearly 40 cents of every dollar in federal discretionary spending now goes to private companies.
In this favor-driven world of fat contracts awarded to the well-connected, Mr. Safavian was only an aspiring consigliere. He was not powerful enough or in government long enough to do much beyond petty reconnaissance for Mr. Abramoff and his lobbying clients. But the Bush brand of competitive sourcing, with its get-rich-quick schemes and do-little jobs for administration pals, spread like a cancer throughout the executive branch. It explains why tens of thousands of displaced victims of Katrina are still living in trailer shantytowns all these months later. It explains why New York City and Washington just lost 40 percent of their counterterrorism funds. It helps explain why American troops are more likely to be slaughtered than greeted with flowers more than three years after the American invasion of Iraq.
The Department of Homeland Security, in keeping with the Bush administration's original opposition to it, isn't really a government agency at all so much as an empty shell, a networking boot camp for future private contractors dreaming of big paydays. Thanks to an investigation by The Times's Eric Lipton, we know that some two-thirds of the top department executives, including Tom Ridge and his principal deputies, have cashed in on their often brief service by becoming executives, consultants or lobbyists for companies that have received billions of dollars in government contracts. Even John Ashcroft, the first former attorney general in American history known to immediately register as a lobbyist, is selling his Homeland Security connections to interested bidders. "When you got it, flaunt it!" as they say in "The Producers."
To see the impact of such revolving-door cronyism, just look at the Homeland Security process that mandated those cutbacks for New York and Washington. The official in charge, the assistant secretary for grants and training, is Tracy Henke, an Ashcroft apparatchik from the Justice Department who was best known for trying to politicize the findings of its Bureau of Justice Statistics. (So much so that the White House installed her in Homeland Security with a recess appointment, to shield her from protracted Senate scrutiny.) Under Henke math, it follows that St. Louis, in her home state (and Mr. Ashcroft's), has seen its counterterrorism allotment rise by more than 30 percent while that for the cities actually attacked on 9/11 fell. And guess what: the private contractor hired by Homeland Security to consult on Ms. Henke's handiwork, Booz Allen Hamilton, now just happens to employ Greg Rothwell, who was the department's procurement chief until December. Booz Allen recently nailed a $250 million Homeland Security contract for technology consulting.
The continuing Katrina calamity is another fruit of outsourced government. As Alan Wolfe details in "Why Conservatives Can't Govern" in the current Washington Monthly, the die was cast long before the storm hit: the Bush cronies installed at FEMA, first Joe Allbaugh and then Michael Brown, had privatized so many of the agency's programs that there was little government left to manage the disaster even if more competent managers than Brownie had been in charge.
But the most lethal impact of competitive sourcing, as measured in human cost, is playing out in Iraq. In the standard narrative of American failure in the war, the pivotal early error was Donald Rumsfeld's decision to ignore the advice of Gen. Eric Shinseki and others, who warned that several hundred thousand troops would be needed to secure the country once we inherited it. But equally reckless, we can now see, was the administration's lax privatization of the country's reconstruction, often with pet companies and campaign contributors and without safeguards or accountability to guarantee results.
Washington's promises to rebuild Iraq were worth no more than its promises to rebuild New Orleans. The government that has stranded a multitude of Americans in flimsy "housing" on the gulf, where they remain prey for any new natural attacks the hurricane season will bring, is of a philosophical and operational piece with the government that has let down the Iraqi people. Even after we've thrown away some $2 billion of a budgeted $4 billion on improving electricity, many Iraqis have only a few hours of power a day, less than they did under Saddam. At his Rose Garden press conference of June 14, the first American president with an M.B.A. claimed that yet another new set of "benchmarks" would somehow bring progress even after all his previous benchmarks had failed to impede three years of reconstruction catastrophes.
Of the favored companies put in charge of our supposed good works in Iraq, Halliburton is the most notorious. But it is hardly unique. As The Los Angeles Times reported in April, it is the Parsons Corporation that is responsible for the "wholesale failure in two of the most crucial areas of the Iraq reconstruction — health and safety — which were supposed to win Iraqi good will and reduce the threat to American soldiers."
Parsons finished only 20 of 150 planned Iraq health clinics, somehow spending $60 million of the budgeted $186 million for its own management and administration. It failed to build walls around 7 of the 17 security forts it constructed to supposedly stop the flow of terrorists across the Iran border. Last week, reported James Glanz of The New York Times, the Army Corps of Engineers ordered Parsons to abandon construction on a hopeless $99.1 million prison that was two years behind schedule. By the calculation of Representative Waxman, some $30 billion in American taxpayers' money has been squandered on these and other Iraq boondoggles botched by a government adhering to the principle of competitive sourcing.
If we had honored our grand promises to the people we were liberating, Dick Cheney's prediction that we would be viewed as liberators might have had a chance of coming true. Greater loyalty from the civilian population would have helped reduce the threat to American soldiers, who are prey to insurgents in places like Yusufiya. But what we've wrought instead is a variation on Arthur Miller's post-World War II drama, "All My Sons." Working from a true story, Miller told the tragedy of a shoddy contractor whose defectively manufactured aircraft parts led directly to the deaths of a score of Army pilots and implicitly to the death of his own son.
Back then such a scandal was a shocking anomaly. Franklin D. Roosevelt's administration, the very model of big government that the current administration vilifies, never would have trusted private contractors to run the show. Somehow that unwieldy, bloated government took less time to win World War II than George W. Bush's privatized government is taking to blow this one.
Letter From Bill Keller on The Times's Banking Records Report
The following is a letter Bill Keller, the executive editor of The Times, has sent to readers who have written him about The Times's publication of information about the government's examination of international banking records:
... It's an unusual and powerful thing, this freedom that our founders gave to the press. Who are the editors of The New York Times (or the Wall Street Journal, Los Angeles Times, Washington Post and other publications that also ran the banking story) to disregard the wishes of the President and his appointees? And yet the people who invented this country saw an aggressive, independent press as a protective measure against the abuse of power in a democracy, and an essential ingredient for self-government. They rejected the idea that it is wise, or patriotic, to always take the President at his word, or to surrender to the government important decisions about what to publish.
The power that has been given us is not something to be taken lightly. The responsibility of it weighs most heavily on us when an issue involves national security, and especially national security in times of war. I've only participated in a few such cases, but they are among the most agonizing decisions I've faced as an editor.
The press and the government generally start out from opposite corners in such cases. The government would like us to publish only the official line, and some of our elected leaders tend to view anything else as harmful to the national interest. For example, some members of the Administration have argued over the past three years that when our reporters describe sectarian violence and insurgency in Iraq, we risk demoralizing the nation and giving comfort to the enemy. Editors start from the premise that citizens can be entrusted with unpleasant and complicated news, and that the more they know the better they will be able to make their views known to their elected officials. Our default position — our job — is to publish information if we are convinced it is fair and accurate, and our biggest failures have generally been when we failed to dig deep enough or to report fully enough. After The Times played down its advance knowledge of the Bay of Pigs invasion, President Kennedy reportedly said he wished we had published what we knew and perhaps prevented a fiasco. Some of the reporting in The Times and elsewhere prior to the war in Iraq was criticized for not being skeptical enough of the Administration's claims about the Iraqi threat. The question we start with as journalists is not "why publish?" but "why would we withhold information of significance?" We have sometimes done so, holding stories or editing out details that could serve those hostile to the U.S. But we need a compelling reason to do so.
Forgive me, I know this is pretty elementary stuff — but it's the kind of elementary context that sometimes gets lost in the heat of strong disagreements.
Since September 11, 2001, our government has launched broad and secret anti-terror monitoring programs without seeking authorizing legislation and without fully briefing the Congress. Most Americans seem to support extraordinary measures in defense against this extraordinary threat, but some officials who have been involved in these programs have spoken to the Times about their discomfort over the legality of the government's actions and over the adequacy of oversight. We believe The Times and others in the press have served the public interest by accurately reporting on these programs so that the public can have an informed view of them.
Our decision to publish the story of the Administration's penetration of the international banking system followed weeks of discussion between Administration officials and The Times, not only the reporters who wrote the story but senior editors, including me. We listened patiently and attentively. We discussed the matter extensively within the paper. We spoke to others — national security experts not serving in the Administration — for their counsel. It's worth mentioning that the reporters and editors responsible for this story live in two places — New York and the Washington area — that are tragically established targets for terrorist violence. The question of preventing terror is not abstract to us.
The Administration case for holding the story had two parts, roughly speaking: first that the program is good — that it is legal, that there are safeguards against abuse of privacy, and that it has been valuable in deterring and prosecuting terrorists. And, second, that exposing this program would put its usefulness at risk.
It's not our job to pass judgment on whether this program is legal or effective, but the story cites strong arguments from proponents that this is the case. While some experts familiar with the program have doubts about its legality, which has never been tested in the courts, and while some bank officials worry that a temporary program has taken on an air of permanence, we cited considerable evidence that the program helps catch and prosecute financers of terror, and we have not identified any serious abuses of privacy so far. A reasonable person, informed about this program, might well decide to applaud it. That said, we hesitate to preempt the role of legislators and courts, and ultimately the electorate, which cannot consider a program if they don't know about it.
We weighed most heavily the Administration's concern that describing this program would endanger it. The central argument we heard from officials at senior levels was that international bankers would stop cooperating, would resist, if this program saw the light of day. We don't know what the banking consortium will do, but we found this argument puzzling. First, the bankers provide this information under the authority of a subpoena, which imposes a legal obligation. Second, if, as the Administration says, the program is legal, highly effective, and well protected against invasion of privacy, the bankers should have little trouble defending it. The Bush Administration and America itself may be unpopular in Europe these days, but policing the byways of international terror seems to have pretty strong support everywhere. And while it is too early to tell, the initial signs are that our article is not generating a banker backlash against the program.
By the way, we heard similar arguments against publishing last year's reporting on the NSA eavesdropping program. We were told then that our article would mean the death of that program. We were told that telecommunications companies would — if the public knew what they were doing — withdraw their cooperation. To the best of my knowledge, that has not happened. While our coverage has led to much public debate and new congressional oversight, to the best of our knowledge the eavesdropping program continues to operate much as it did before. Members of Congress have proposed to amend the law to put the eavesdropping program on a firm legal footing. And the man who presided over it and defended it was handily confirmed for promotion as the head of the CIA.
A secondary argument against publishing the banking story was that publication would lead terrorists to change tactics. But that argument was made in a half-hearted way. It has been widely reported — indeed, trumpeted by the Treasury Department — that the U.S. makes every effort to track international financing of terror. Terror financiers know this, which is why they have already moved as much as they can to cruder methods. But they also continue to use the international banking system, because it is immeasurably more efficient than toting suitcases of cash.
I can appreciate that other conscientious people could have gone through the process I've outlined above and come to a different conclusion. But nobody should think that we made this decision casually, with any animus toward the current Administration, or without fully weighing the issues.
Saturday, June 24
Reading Leo Strauss
June 25, 2006
'Reading Leo Strauss,' by Steven B. Smith
Neocon or Not?
Review by ROBERT ALTER
FOR a scholar who addressed what the general public would regard as abstruse topics in a dry academic fashion, Leo Strauss has become a name that reverberates widely — and, for many, ominously. He is seen as the seminal thinker behind neoconservatism, its intellectual father.
Born into an Orthodox Jewish home in a small German town in 1899, Strauss was trained in the rigorous discipline of Geistesgeschichte, intellectual history. He began his career in the 1920's in an innovative adult Jewish learning institute. His first book was on Spinoza, and he subsequently devoted scrupulous, often maverick, studies to major figures of political philosophy from Plato and Maimonides to Machiavelli, Hobbes and the framers of the American Constitution. He left Germany in 1932, went to England via Paris, and in 1938 came to the United States. He taught for a decade at the New School in New York and then from 1949 to 1968 at the University of Chicago, where he exerted his greatest influence. He died in 1973.
Strauss was very much caught up in an extraordinary intellectual ferment among German Jews who came of age around the time of World War I. He was friends with Gershom Scholem, the great historian of Jewish mysticism, in the early 1920's. He worked with Franz Rosenzweig, the bold architect of a Jewish existentialist theology. He was admired by Scholem's friend Walter Benjamin, the eminent literary critic and cultural theorist. Like all these thinkers, he was concerned with the tensions between tradition, founded on revelation, and modernity, operating with unaided reason.
How, then, has Strauss come to be viewed as a sinister presence in contemporary politics? Some of his students, or students of his students, went on to become conservative policy intellectuals in Washington. Perhaps the best known of his disciples, Allan Bloom, remained at the University of Chicago, where he wrote his best-selling book, "The Closing of the American Mind" (1987), a scathing critique of the debasement of American higher education by conformist progressivism. In the mid-1980's, a highly critical article in The New York Review of Books linked Strauss with conservatism, and in the next few years, numerous pieces in other journals followed suit. It has become received wisdom that a direct line issues from Strauss's seminars on political philosophy at the University of Chicago to the hawkish approach to foreign policy taken by figures like Paul Wolfowitz and others in the Bush administration.
"Reading Leo Strauss," Steven B. Smith's admirably lucid, meticulously argued book, persuasively sets the record straight on Strauss's political views and on what his writing is really about. The epigraph to its introduction, from an essay by the political scientist Joseph Cropsey, sounds the keynote: "Strauss was a towering presence . . . who neither sought nor had any discernible influence on what passes for the politics of the group."
Although it is said that Strauss voted twice for Adlai Stevenson, he appears never to have been involved in any political party or movement. What is more important is that his intellectual enterprise, as Smith's careful exposition makes clear, repeatedly argued against the very idea of political certitude that has been embraced by certain neoconservatives. Strauss's somewhat contrarian reading of Plato's "Republic," for example, proposed that the dialogue was devised precisely to demonstrate the dangerous unfeasibility of a state governed by a philosopher-king.
"Throughout his writings," Smith concludes, "Strauss remained deeply skeptical of whether political theory had any substantive advice or direction to offer statesmen." This view was shaped by his wary observation of the systems of totalitarianism that dominated two major European nations in the 1930's, Nazism in Germany and Communism in the Soviet Union. As a result, he strenuously resisted the notion that politics could have a redemptive effect by radically transforming human existence. Such thinking could scarcely be further from the vision of neoconservative policy intellectuals that the global projection of American power can effect radical democratic change. "The idea," Smith contends, "that political or military action can be used to eradicate evil from the human landscape is closer to the utopian and idealistic visions of Marxism and the radical Enlightenment than anything found in the writings of Strauss."
Liberal democracy lies at the core of Strauss's political views, and its basis is the concept of skepticism. Since there are no certainties in the realm of politics, perhaps not in any realm, politics must be the arena for negotiation between different perspectives, with cautious moderation likely to be the best policy. At one point, Smith, the Alfred Cowles professor of political science at Yale, describes Strauss's position as "liberalism without illusions." All this may sound a little antiquated, and Smith is right to associate Strauss with cold war liberals like Raymond Aron, Isaiah Berlin, Walter Lippmann and Lionel Trilling. But it's a view from the middle of the past century that might profitably be fostered in our own moment of political polarization, when a self-righteous sense of possessing assured truths is prevalent on both the right and the left.
The other general point that Smith makes about Strauss's alleged paternity of neoconservatism is that a considerable part of his work has nothing to do with politics of any sort. Smith divides his book — a collection of previously published essays, inevitably with some repetition among them — into two parts, the first entitled "Jerusalem," the second, "Athens." Strauss used these terms to designate the two poles of Western culture, roughly corresponding to revelation and reason. It is in the "Athens" section that Smith traces Strauss's trajectory through the history of political philosophy. The essays of the "Jerusalem" part, on the other hand, follow his engagement with Maimonides, Spinoza, Scholem and Zionism (a movement that he had embraced from adolescence but that he thought did not alter the metaphysical condition of galut, exile, in which Jews found themselves).
The Jewish-theological side of Strauss certainly had no perceptible effect on his American disciples, most of them Jews and all of them, as far as I know, secular. In these concerns, Strauss was thoroughly the intellectual product of 1920's German Jewry. Like others of that period, including Walter Benjamin, he approached the idea of revealed religion with the utmost seriousness. It does not appear that he remained a believing Jew, yet he was not prepared simply to dismiss the claims of Jerusalem against Athens.
On the contrary, the sweeping agenda of reformist or revolutionary reason first put forth in the Enlightenment worried him deeply, and he saw religion, with its assertion of a different source of truth, as a necessary counterweight to the certitudes of the 18th century. His vision of reality was, to use a term favored by both Scholem and Benjamin, "dialectic." Why some of his most prominent students missed this essential feature of his thought, and why they turned to the right, remains one of the mysteries of his intellectual legacy.
Robert Alter's most recent book is "Imagined Cities: Urban Experience and the Language of the Novel."