Saturday, June 30
I woke up at 4 this morning to the realization that I cannot begin to understand how Chief Justice John Roberts and his colleagues could really think that the efforts of the people in Jefferson County, Ky., and Seattle to have white and black students educated together are anything remotely like the system of racial apartheid, subjugation, and servitude practiced in the American South. His concluding sentence, "The way to stop discrimination on the basis of race is to stop discriminating on the basis of race," equates two such fundamentally different practices that it leaves me stunned.
I want to try to convey a sense of how profoundly misguided and ahistorical that conflation is. Let me return to the most rhetorically powerful argument against the two school-district plans in this case: the fact that they would require some parents to say to a child, "You can't go to that school because of your race," just as black parents had to say to their children in the South before Brown.
Think first about being a black parent explaining race to a child in North Carolina in 1963. That year, Gov. Terry Sanford went on statewide television to urge an end to segregation in public accommodations and read a letter from a black soldier stationed at Fort Bragg describing what it was like to drive his children from eastern North Carolina to visit their grandparents in Texas. It was a harrowing experience, he wrote. Planning that trip was like a military operation; every supply that might be needed had to be packed and stuffed in the car for a trip of more than 1,000 miles. When they were hungry, they could not buy food. When they were tired, they knew they would be turned away from the motel. They traveled in fear that a child would become sick on the trip. Day after day they would drive by tourist sites and amusement parks that they could not enter, past gas stations at which the children were barred from the restroom. How do you explain to a child why she can't go to the swimming pool, play in the park, or go to the movie? At home or on the road, this was an experience a child of color had repeatedly every day. Every day. And the reason: The child was an inferior being whose very presence in a place would be repulsive to the community.
Is that what happens under the Louisville or Seattle plans? What some parents will sometimes have to say to their children under these plans is something like this: "You will be going to PS 111 instead of PS 109 this year, and here's why: Our community is trying to make sure that we get over the racial separation that has been such a troubled part of our history. So we want to make sure we have a pretty good number of white and black children in all of our schools. It's important, even though it sometimes means you don't get your first choice of a school assignment this year." As I read the record, that is unlikely ever to happen more than once to any child, white or black. What is the big deal?
Why is it so critical that we "get beyond race" in every possible way? Get beyond despising or disliking people because of their race, yes. Get beyond oppressing people because of their race, yes. But avoiding any consideration of race as though it were toxic? I don't understand that.
The court's decision is everything conservatives should abhor. It is a form of social engineering dictated from Washington. It ignores the principle of local control of schools. It sets aside the judgment of elected officials, even though nothing in the text of the Constitution requires that result, and the original understanding at the time the 14th Amendment was drafted is solidly against it. It equates the well-intentioned and inclusive programs supported by both white and black people in Louisville and Seattle with the whole grotesquerie of racially oppressive practices that came down, as Charles Black once said, in apostolic succession from slavery and the Black Codes.
The plurality opinion is elegantly reasoned and reads as if it could have been written by a law review president. But it fails the very first lesson taught to preschoolers who watch Sesame Street: "Which of These Things Is Not Like the Others?"
It's been a pleasure talking with you both.
Friday, June 29
In October 2003, the nonpartisan Program on International Policy Attitudes published a study titled “Misperceptions, the media and the Iraq war.” It found that 60 percent of Americans believed at least one of the following: clear evidence had been found of links between Iraq and Al Qaeda; W.M.D. had been found in Iraq; world public opinion favored the U.S. going to war with Iraq.
The prevalence of these misperceptions, however, depended crucially on where people got their news. Only 23 percent of those who got their information mainly from PBS or NPR believed any of these untrue things, but the number was 80 percent among those relying primarily on Fox News. In particular, two-thirds of Fox devotees believed that the U.S. had “found clear evidence in Iraq that Saddam Hussein was working closely with the Al Qaeda terrorist organization.”
So, does anyone think it’s O.K. if Rupert Murdoch’s News Corporation, which owns Fox News, buys The Wall Street Journal?
The problem with Mr. Murdoch isn’t that he’s a right-wing ideologue. If that were all he was, he’d be much less dangerous. What he is, rather, is an opportunist who exploits a rule-free media environment — one created, in part, by conservative political power — by slanting news coverage to favor whoever he thinks will serve his business interests.
In the United States, that strategy has mainly meant blatant bias in favor of the Bush administration and the Republican Party — but last year Mr. Murdoch covered his bases by hosting a fund-raiser for Hillary Clinton’s Senate re-election campaign.
In Britain, Mr. Murdoch endorsed Tony Blair in 1997 and gave his government favorable coverage, “ensuring,” reports The New York Times, “that the new government would allow him to keep intact his British holdings.”
And in China, Mr. Murdoch’s organizations have taken care not to offend the dictatorship.
Now, Mr. Murdoch’s people rarely make flatly false claims. Instead, they usually convey misinformation through innuendo. During the early months of the Iraq occupation, for example, Fox gave breathless coverage to each report of possible W.M.D.’s, with little or no coverage of the subsequent discovery that it was a false alarm. No wonder, then, that many Fox viewers got the impression that W.M.D.’s had been found.
When all else fails, Mr. Murdoch’s news organizations simply stop covering inconvenient subjects.
Last year, Fox relentlessly pushed claims that the “liberal media” were failing to report the “good news” from Iraq. Once that line became untenable — well, the Project for Excellence in Journalism found that in the first quarter of 2007 daytime programs on Fox News devoted only 6 percent of their time to the Iraq war, compared with 18 percent at MSNBC and 20 percent at CNN.
What took Iraq’s place? Anna Nicole Smith, who received 17 percent of Fox’s daytime coverage.
Defenders of Mr. Murdoch’s bid for The Journal say that we should judge him not by Fox News but by his stewardship of the venerable Times of London, which he acquired in 1981. Indeed, the political bias of The Times is much less blatant than that of Fox News. But a number of former Times employees have said that there was pressure to slant coverage — and everyone I’ve seen quoted defending Mr. Murdoch’s management is still on his payroll.
In any case, do we want to see one of America’s two serious national newspapers in the hands of a man who has done so much to mislead so many? (The Washington Post, for all its influence, is basically a Beltway paper, not a national one. The McClatchy papers, though their Washington bureau’s reporting in the run-up to Iraq put more prestigious news organizations to shame, still don’t have The Journal’s ability to drive national discussion.)
There doesn’t seem to be any legal obstacle to the News Corporation’s bid for The Journal: F.C.C. rules on media ownership are mainly designed to prevent monopoly in local markets, not to safeguard precious national informational assets. Still, public pressure could help avert a Murdoch takeover. Maybe Congress should hold hearings.
If Mr. Murdoch does acquire The Journal, it will be a dark day for America’s news media — and American democracy. If there were any justice in the world, Mr. Murdoch, who did more than anyone in the news business to mislead this country into an unjustified, disastrous war, would be a discredited outcast. Instead, he’s expanding his empire.
Saturday, June 23
When you watch a baby being born, after a difficult pregnancy, it is so painful and bloody for the mother that it is always hard to tell the truth and say, "Gosh, that baby is really ugly." But that's how I feel about the energy legislation passed (and not passed) by the Senate last week.
The whole Senate energy effort only reinforced my feelings that we’re in a green bubble — a festival of hot air by the news media, corporate America and presidential candidates about green this and green that, but, when it comes to actually doing something hard to bring about a green revolution at scale — and if you don’t have scale on this you have nothing — we wimp out. Climate change is not a hoax. The hoax is that we are really doing something about it.
No question, it’s great news that the Democrat-led Senate finally stood up to the automakers, and to the Michigan senators, and said, “No more — no more assisted suicide of the U.S. auto industry by the U.S. Congress. We’re passing the first bill since 1975 that mandates an increase in fuel economy.” If the Senate bill, which now has to go through the House, becomes law, automakers will have to boost the average mileage of new cars and light trucks to 35 miles per gallon by 2020, compared with about 25 miles per gallon today.
But before you celebrate, pay attention to some fine print in the Senate bill. If the Transportation Department determines that the fuel economy goal for any given year is not “cost-effective” — that is, too expensive for the car companies to meet — it can ease the standard. That loophole has to be tightened by the House, which takes up this legislation next week.
But even this new mileage standard is not exactly world leading. The European Union is today where we want to be in 2020, around 35 miles per gallon, and it is committed to going well over 40 m.p.g. by 2012. Ditto Japan.
There are other things that make the Senate energy effort ugly. Senate Republicans killed a proposed national renewable electricity mandate that would have required utilities to produce 15 percent of their power from wind, solar, biomass and other clean-energy sources by 2020. Twenty-three states already have such mandates. No matter. Making it national was too much for the Republicans.
And the Senate, thanks again to the Republicans, also squashed a Democratic proposal to boost taxes on oil and gas companies that would have raised some $32 billion for alternative fuel projects.
Despite all the new research on climate change, the Senate didn’t even touch the idea of either a cap-and-trade system or a carbon tax to limit carbon dioxide emissions. An effort by Senator Amy Klobuchar of Minnesota to legislate a national reporting (“carbon counter”) system to simply measure all sources of greenhouse gas emissions, which would enable a cap-and-trade system to work if we ever passed one, also got killed by Republicans. We can’t cap and trade something we can’t measure.
Here is the truth: the core of our energy crisis is in Washington. We have all the technology we need right now to make huge inroads in becoming more energy efficient and energy independent, with drastically lower emissions. We have all the capital we need as well. But because of the unique nature of the energy and climate-change issues — which require incentives and regulations to build alternatives to dirty, but cheap, fossil fuels — you need public policy to connect the energy and capital the right way. That is what has been missing.
“We have to work to ensure that the House will at least toughen the provisions that the Senate passed,” said Dan Becker, director of the Sierra Club’s Global Warming Program.
The public wants it. But energy policy gets shaped in the halls of Congress — where wily lobbyists, legacy industries and politicians greedy for campaign contributions regularly sell out the country’s interests for their own. Only when the public really rises up — as it has finally done against the auto companies — do we even get moderate change. Don’t look to the Bush team to lead the revolution.
“We are the only major country in the world where no one even knows the name of the environment minister — the head of our Environmental Protection Agency,” said Representative Edward Markey, a Massachusetts Democrat. “Whoever it is — and most people don’t even know if it is a he or a she — has been in a six-year witness protection program. Until the Democrats took over, no Bush E.P.A. administrator appeared before the House committee in charge of energy and climate change.”
Folks, we’re home alone. So call your House member — especially the Republicans. If you don’t, some lobbyist will.
BY this late date we should know the fix is in when the White House's top factotums fan out on the Sunday morning talk shows singing the same lyrics, often verbatim, from the same hymnal of spin. The pattern was set way back on Sept. 8, 2002, when in simultaneous appearances three cabinet members and the vice president warned darkly of Saddam's aluminum tubes. "We don't want the smoking gun to be a mushroom cloud," said Condi Rice, in a scripted line. The hard sell of the war in Iraq — the hyping of a (fictional) nuclear threat to America — had officially begun.
America wasn't paying close enough attention then. We can't afford to repeat that blunder now. Last weekend the latest custodians of the fiasco, our new commander in Iraq, Gen. David Petraeus, and our new ambassador to Baghdad, Ryan Crocker, took to the Sunday shows with two messages we'd be wise to heed.
The first was a confirmation of recent White House hints that the long-promised September pivot point for judging the success of the "surge" was inoperative. That deadline had been asserted as recently as April 24 by President Bush, who told Charlie Rose that September was when we'd have "a pretty good feel" whether his policy "made sense." On Sunday General Petraeus and Mr. Crocker each downgraded September to merely a "snapshot" of progress in Iraq. "Snapshot," of course, means "Never mind!"
The second message was more encoded and more ominous. Again using similar language, the two men said that in September they would explain what Mr. Crocker called "the consequences" and General Petraeus "the implications" of any alternative "courses of action" to their own course in Iraq. What this means in English is that when the September "snapshot" of the surge shows little change in the overall picture, the White House will say that "the consequences" of winding down the war would be even more disastrous: surrender, defeat, apocalypse now. So we must stay the surge. Like the war's rollout in 2002, the new propaganda offensive to extend and escalate the war will be exquisitely timed to both the anniversary of 9/11 and a high-stakes Congressional vote (the Pentagon appropriations bill).
General Petraeus and Mr. Crocker wouldn't be sounding like the Bobbsey Twins and laying out this coordinated rhetorical groundwork were they not already anticipating the surge's failure. Both spoke on Sunday of how (in General Petraeus's variation on the theme) they had to "show that the Baghdad clock can indeed move a bit faster, so that you can put a bit of time back on the Washington clock." The very premise is nonsense. Yes, there is a Washington clock, tied to Republicans' desire to avoid another Democratic surge on Election Day 2008. But there is no Baghdad clock. It was blown up long ago and is being no more successfully reconstructed than anything else in Iraq.
When Mr. Bush announced his "new way forward" in January, he offered a bouquet of promises, all unfulfilled today. "Let the Iraqis lead" was the policy's first bullet point, but in the initial assault on insurgents now playing out so lethally in Diyala Province, Iraqi forces were kept out of the fighting altogether. They were added on Thursday: 500 Iraqis, following 2,500 Americans. The notion that these Shiite troops might "hold" this Sunni area once the Americans leave is an opium dream. We're already back fighting in Maysan, a province whose security was officially turned over to Iraqi authorities in April.
In his January prime-time speech announcing the surge, Mr. Bush also said that "America will hold the Iraqi government to the benchmarks it has announced." More fiction. Prime Minister Nuri al-Maliki's own political adviser, Sadiq al-Rikabi, says it would take "a miracle" to pass the legislation America wants. Asked on Monday whether the Iraqi Parliament would stay in Baghdad this summer rather than hightail it to vacation, Tony Snow was stumped.
Like Mr. Crocker and General Petraeus, Mr. Snow is on script for trivializing September as judgment day for the surge, saying that by then we'll only "have a little bit of metric" to measure success. This administration has a peculiar metric system. On Thursday, Peter Pace, the departing chairman of the Joint Chiefs of Staff, called the spike in American troop deaths last week the "wrong metric" for assessing the surge's progress. No doubt other metrics in official reports this month are worthless too, as far as the non-reality-based White House is concerned. The civilian casualty rate is at an all-time high; the April-May American death toll is a new two-month record; overall violence in Iraq is up; only 146 out of 457 Baghdad neighborhoods are secure; the number of internally displaced Iraqis has quadrupled since January.
Last week Iraq rose to No. 2 in Foreign Policy magazine's Failed State Index, barely nosing out Sudan. It might have made No. 1 if the Iraqi health ministry had not stopped providing a count of civilian casualties. Or if the Pentagon were not withholding statistics on the increase of attacks on the Green Zone. Apparently the White House is working overtime to ensure that the September "snapshot" of Iraq will be an underexposed blur. David Carr of The Times discovered that the severe Pentagon blackout on images of casualties now extends to memorials for the fallen in Iraq, even when a unit invites press coverage.
Americans and Iraqis know the truth anyway. The question now is: What will be the new new way forward? For the administration, the way forward will include, as always, attacks on its critics' patriotism. We got a particularly absurd taste of that this month when Harry Reid was slammed for calling General Pace incompetent and accusing General Petraeus of exaggerating progress on the ground.
General Pace's record speaks for itself; the administration declined to go to the mat in the Senate for his reappointment. As for General Petraeus, who recently spoke of "astonishing signs of normalcy" in Baghdad, he is nothing if not consistent. He first hyped "optimism" and "momentum" in Iraq in an op-ed article in September 2004.
Come September 2007, Mr. Bush will offer his usual false choices. We must either stay his disastrous course in eternal pursuit of "victory" or retreat to the apocalypse of "precipitous withdrawal." But by the latest of the president's ever-shifting definitions of victory, we've already lost. "Victory will come," he says, when Iraq "is stable enough to be able to be an ally in the war on terror and to govern itself and defend itself." The surge, which he advertised as providing "breathing space" for the Iraqi "unity" government to get its act together, is tipping that government into collapse. As Vali Nasr, author of "The Shia Revival," has said, the new American strategy of arming Sunni tribes is tantamount to saying the Iraqi government is irrelevant.
For the Bush White House, the real definition of victory has become "anything they can get away with without taking blame for defeat," said the retired Army Gen. William Odom, a national security official in the Reagan and Carter administrations, when I spoke with him recently. The plan is to run out the Washington clock between now and Jan. 20, 2009, no matter the cost.
Precipitous withdrawal is also a chimera, since American manpower, materiel and bases, not to mention our new Vatican City-sized embassy, can't be drawn down overnight. The only real choice, as everyone knows, is an orderly plan for withdrawal that will best serve American interests. The real debate must be over what that plan is. That debate can't happen as long as the White House gets away with falsifying reality, sliming its opponents and sowing hyped fears of Armageddon. The threat that terrorists in civil-war-torn Iraq will follow us home if we leave is as bogus as Saddam's mushroom clouds. The Qaeda that actually attacked us on 9/11 still remains under the tacit protection of our ally, Pakistan.
As General Odom says, the endgame will start "when a senior senator from the president's party says no," much as William Fulbright did to L.B.J. during Vietnam. That's why in Washington this fall, eyes will turn once again to John Warner, the senior Republican with the clout to give political cover to other members of his party who want to leave Iraq before they're forced to evacuate Congress. In September, it will be nearly a year since Mr. Warner said that Iraq was "drifting sideways" and that action would have to be taken "if this level of violence is not under control and this government able to function."
Mr. Warner has also signaled his regret that he was not more outspoken during Vietnam. "We kept surging in those years," he told The Washington Post in January, as the Iraq surge began. "It didn't work." Surely he must recognize that his moment for speaking out about this war is overdue. Without him, the Democrats don't have the votes to force the president's hand. With him, it's a slam dunk. The best way to honor the sixth anniversary of 9/11 will be to at last disarm a president who continues to squander countless lives in the names of those voiceless American dead.
Friday, June 22
By Bruce Reed
Thursday, June 21, 2007
If you think you've had a long week, be glad you're not Rudy Giuliani. On Tuesday, his top Iowa adviser left to become Bush's OMB director. Then he had to dump his South Carolina campaign chair, who was charged with cocaine possession and distribution. But for Giuliani, those headaches paled alongside the week's most excruciating spectacle: seeing his successor, Michael Bloomberg, grace the cover of Time and leave the GOP to plot an independent bid for President. Even if Bloomberg ultimately decides not to run, Giuliani may already be the Bloomberg campaign's first victim.
For Giuliani, the Bloomberg boomlet is bad news on every level. First, Bloomberg joins Fred Thompson in sucking up much of the oxygen that Giuliani's campaign needs to keep breathing. In most national and statewide polls, Giuliani's lead is slipping or has disappeared altogether. While Bloomberg explores how many billions it might take to buy an election, Giuliani suddenly finds himself in no-man's land, as a frontrunner who can't buy a headline.
On Wednesday, Giuliani gave a speech detailing the first of his "12 Commitments." Granted, no one should make too much of a commitment ceremony with Rudy Giuliani. But the plan he offered on fiscal discipline wasn't bad. The national press chose to write another day of stories about Bloomberg.
The second burden is personal. Giuliani is famously selfish about sharing the limelight. He once fired his police chief, William Bratton, for appearing on the cover of Time. Giuliani's attitude was, "That's my job!" Now a man he thinks he picked for mayor has done it again. Far from firing him, Giuliani has to sit there and read all about it.
Most speculation about Giuliani and Bloomberg has focused on the general election, and the marquee prospect of a Subway Series between two New York mayors and a New York senator. But for Giuliani, the real threat Bloomberg may pose is in the primaries.
Unlike most presidential candidates, who tend to embellish their hometown roots, Giuliani depends on making Republican-primary voters forget every aspect of his past except 9/11. His Web site calls him "a strong supporter of the Second Amendment," not a Brady-billing assault-weapon banner. He's not from the "abortion capital of the world"; he's for parental notification and decreasing abortions. Gay rights? He's such a traditionalist, his record boasts more straight marriages than any other candidate.
Giuliani's Escape from New York was already tough enough, but Mayor Mike makes it nearly impossible. Bloomberg is the Ghost of Rudy Past—a constant, high-profile reminder of the cultural distance from the South Carolina lowlands to the New York island.
When Bloomberg launched his gun-control crusade, he gave it a name that sounds like the headline from a GOP rival campaign's oppo piece on Giuliani: "Mayors Against Illegal Guns." For conservatives, the same accomplishments the national media loves about Bloomberg are the first signs of the Trilateralist Apocalypse: From penthouses in Manhattan, they'll come for your guns; then they'll snuff your tobacco; and in a final blow to life, liberty, and the pursuit of happiness, they'll take away your God-given right to trans-fats.
Mitt Romney looks disingenuous enough pretending that he saw Massachusetts and tried to stop it. Giuliani has no excuse. His last act as New York City mayor was to urge his people to elect Bloomberg to succeed him. Watching Bloomberg quit the party only reminds conservatives of their primal fear about Giuliani—that the GOP is not an article of faith but a way station of convenience.
You can take the mayor out of the city, but you can't take the city out of the mayor. The more coverage Bloomberg gets, the more his allies will compound the impression that one Hizzoner looks like another. In yesterday's Washington Post, Al Sharpton described Bloomberg with one of those only-in-New-York images:
"A girl in high school catches you looking at her and she starts wearing nice dresses," Sharpton says. "It doesn't mean she is going to date you. But she's at least teasing you, so it really increases your hope. This is a serious tease."
Sharpton just confirmed what they already thought down in South Carolina: Every New York mayor's a cross-dresser.
Sunday, June 17
This is not a joke anyone would think to make up these days. The absentminded professor, that kindly old figure, is long gone. A new image has taken his place, one that bespeaks not only our culture’s hostility to the mind, but also its desperate confusion about the nature of love.
Look at recent movies about academics, and a remarkably consistent pattern emerges. In The Squid and the Whale (2005), Jeff Daniels plays an English professor and failed writer who sleeps with his students, neglects his wife, and bullies his children. In One True Thing (1998), William Hurt plays an English professor and failed writer who sleeps with his students, neglects his wife, and bullies his children. In Wonder Boys (2000), Michael Douglas plays an English professor and failed writer who sleeps with his students, has just been left by his third wife, and can’t commit to the child he’s conceived in an adulterous affair with his chancellor. Daniels’s character is vain, selfish, resentful, and immature. Hurt’s is vain, selfish, pompous, and self-pitying. Douglas’s is vain, selfish, resentful, and self-pitying. Hurt’s character drinks. Douglas’s drinks, smokes pot, and takes pills. All three men measure themselves against successful writers (two of them, in Douglas’s case; his own wife, in Daniels’s) whose presence diminishes them further. In We Don’t Live Here Anymore (2004), Mark Ruffalo and Peter Krause divide the central role: both are English professors, and both neglect and cheat on their wives, but Krause plays the arrogant, priapic writer who seduces his students, Ruffalo the passive, self-pitying failure. A Love Song For Bobby Long (2004) divides the stereotype a different way, with John Travolta as the washed-up, alcoholic English professor, Gabriel Macht as the blocked, alcoholic writer.
Not that these figures always teach English. Kevin Spacey plays a philosophy professor — broken, bitter, dissolute — in The Life of David Gale (2003). Steve Carell plays a self-loathing, suicidal Proust scholar in Little Miss Sunshine (2006). Both characters fall for graduate students, with disastrous results. And while the stereotype has gained a new prominence of late, its roots go back at least a few decades. Many of its elements are in place in Oleanna (1994), in Surviving Desire (1991), and, with John Mahoney’s burnt-out communications professor, in Moonstruck (1987). In fact, all of its elements are in place in Terms of Endearment (1983), where Jeff Daniels took his first turn playing a feckless, philandering English professor. And of course, almost two decades before that, there was Who’s Afraid of Virginia Woolf?
What’s going on here? If the image of the absent-minded professor stood for benevolent unworldliness, what is the meaning of the new academic stereotype? Why are so many of these failed professors also failed writers? Why is professional futility so often connected with sexual impropriety? (In both Terms of Endearment and We Don’t Live Here Anymore, “going to the library” becomes a euphemism for “going to sleep with a student.”) Why are these professors all men, and why are all the ones who are married such miserable husbands?
The answers can be found in the way these movies typically unfold. Consider One True Thing, in which these questions are most fully and intelligently played out. As the movie opens, Hurt’s George Gulden comes across as a monumental figure. Seen through the eyes of his daughter, Ellen, from whose perspective the story unfolds, George embodies the highest intellectual and ethical standards: brilliant, passionate, demanding, a gifted critic and beloved teacher, a dispenser of anecdotes and aphorisms that suggest a near converse with the gods. Ellen, an ambitious young journalist, has worshiped him since she was a little girl — emulating him, yearning for his hard-won approval, and disdaining her less-educated mother, Kate, as trivial and weak. Kate belongs to a group of local wives who devote themselves to performing acts that seem utterly inconsequential and who, as if to advertise their own insignificance, call themselves the “Minnies.” But when George summons Ellen home to care for her dying mother — or, as it turns out, to care for him in his wife’s stead — his daughter gradually comes to see her parents for what they really are. George is a novelist manqué who recycles his stories, plagiarizes his witticisms, and drinks away his sorrows in secret (he no longer even has the starch to chase graduate students). His wife is really the strong one. While George and his kind dream their petty dreams of glory, the Minnies hold the community together. One day, Kate forces Ellen on an excruciating drive during which Kate and another woman sing silly songs at the top of their lungs. Afterward, Kate explains to Ellen that the woman has been living as a virtual shut-in since her husband left her, so the Minnies have been taking turns getting her out of the house. Ellen learns that just as the Minnies have held the community together, her mother has held the family together — held it together, it turns out, until her death. 
The “one true thing,” Ellen realizes, is not intellect or ambition, as she’d been taught to believe, but love.
The lesson is typical in these films and points to the meaning of the new academic stereotype. The alcoholic, embittered, writer-manqué English professor who neglects his family and seduces his students is a figure of creative sterility, and he is creatively sterile because he loves only himself. Hence his vanity, pomposity, and selfishness; his self-pity, passivity, and resentment. Hence his ambition and failure. And thence his lechery, for sleeping with his students is a sign not of virility but of impotence: he can only hit the easy targets; he feeds on his students’ vitality; he can’t succeed in growing up. Other symbolic emasculations abound. John Travolta stumbles around in a bathrobe. Michael Douglas stumbles around in a pink one. Steve Carell’s character is gay. But most importantly, nearly all of them are set against a much stronger woman, usually a wife, whose power lies precisely in her ability to love: to sacrifice, to empathize, to connect. By the end of the movie, in the typical case, the academic, too, has learned to love and, having been humbled as thoroughly as Rochester in Jane Eyre, is equally ready for redemptive female ministration.
There are several things to note about all this. First, while the new stereotype is akin to the political/journalistic image of the academy as a bastion of effete, liberal, eggheaded snobs, its emphasis is different. The liberalism, which in the news media is central, is generally absent (we almost never learn anything about movie professors’ political beliefs), while the effeteness is central. Elitism and intellectualism are downplayed, the first usually manifesting as personal arrogance rather than as a wider cultural attitude, the second invariably expressed in the movie shorthand of quoting famous authors. Second, the new stereotype is not confined to film. Most of the dozen movies I’ve been considering were adapted from novels, short stories, or plays. Other partial examples include Saul Bellow’s Herzog, Philip Roth’s Kepesh books, and Wallace Stegner’s last novel, Crossing to Safety. Zadie Smith’s On Beauty is a full example, as are many other works from the burgeoning genre of campus fiction. Richard Powers shows how reflexive the image has become with his glance in The Gold Bug Variations at the heroine’s “fully clothed grope with her thesis instructor, momentarily aroused for the first time since his tenure, when the two of them compared the relative merits of Volpone and As You Like It.”
Indeed, the new stereotype has its roots in literary examples that go back well over a century, most conspicuously to Casaubon in Middlemarch and to Mr. Ramsay in To the Lighthouse, both pompous, aging narcissists, the former creatively and sexually sterile but married to a passionate young beauty, the latter, though he has written many books and fathered eight children, sustained only by regular fertilizations by his wife’s maternal fecundity. One should also mention Hedda Gabler’s George Tesman and Uncle Vanya’s Serebryakoff, another pair of ponderous failures misallied to beautiful young women. But the sex of the authors of the two novels I just mentioned points to perhaps the most significant fact about the new academic stereotype and the narrative paradigm in which it is typically situated, which is that they are a way of articulating the superiority of female values to male ones: of love, community, and self-sacrifice to ambition, success, and fame.
So why are academics regarded as the most appropriate instrument for this lesson? Yes, there are any number of movies in which a high-powered lawyer or executive or even artist (male or female) learns that family and friendship are more important than money and success, but these figures are allowed to become rich and successful first, before discovering what really matters (and are allowed to hold on to their wealth and fame afterward). Only for academics is ambition as such reprehensible. Only for them is it self-defeating, even on its own terms. The explanation lies in another remarkable fact about the new stereotype (though it was also part of the old one): the representative academic is always a professor of humanities. The ones who aren’t English professors are professors of history or philosophy or art history or French. And this goes as much for the novels and plays I’ve mentioned as for the films. It seems that in the popular imagination, “professor” means “humanities professor.” Of course, there are plenty of science professors in movies and books, but they are understood as scientists, not professors. Social scientists are quoted liberally in the press, but generally under the rubric of “scholar” or “expert.” Stereotypes arise from the partitioning of complex realities — academics play multiple roles — into mutually isolated simplifications. Say the word professor, and the popular mind, now as in the old days, conjures up the image of a quotation-spouting bookworm. And it is that figure who has become an object lesson in the vanity of ambition.
In the popular imagination, humanities professors don’t have anything to be ambitious about. No one really knows what they do, and to the extent that people do know, they don’t think it’s worth doing — which is why, when the subject of humanistic study is exposed to public view, it is often ridiculed as trivial, arcane, or pointless. Other received ideas come into play here: “those who can’t do, teach”; the critic as eunuch or parasite; the ineffective intellectual; tenure as a system for enshrining mediocrity. It may be simply because academics don’t pursue wealth, power, or, to any real extent, fame that they are vulnerable to such accusations. In our culture, the willingness to settle for something less than these Luciferian goals is itself seen as emasculating. Academics are ambitious, but in a weak, pathetic way. This may also explain why they are uniquely open to the charge of passionlessness. No one expects a lawyer to be passionate about the law: he’s doing it for the money. No one expects a plumber to be passionate about pipes: he’s doing it to support his family. But a professor’s only excuse for doing something so trivial and accepting such paltry rewards for it is his love for the subject. If that’s gone, what remains? Nothing but baseless vanity and feeble ambition. Professors, in the popular imagination, are absurd little men puffing themselves up about nothing. It’s no wonder they need to be taught a lesson.
Still, none of this explains why the new academic stereotype has emerged just now. The first possibility is that today’s academics are portrayed as pompous, lecherous, alcoholic failures because that’s what they are. In terms of some of the longer-lasting elements of the professorial image, this is no doubt true. Pedantry and elitism are inherent temptations in the academic enterprise, and Max Weber remarked nearly a century ago that, for professors, vanity is a sort of occupational disease. Precisely because they don’t possess the kind of wealth that accrues to doctors and lawyers or the status wealth confers, academics are more apt to parade their intellectual superiority than members of other elite professions. But professors have neither a monopoly on nor a disproportionate share of quiet desperation or the self-destructive gestures that attend it. Male professors are not less-devoted or less-faithful husbands, on average, than other men — in fact, relative to wealthier ones, they are probably more so. (That there are now a substantial number of female academics is a circumstance the popular imagination has yet to discover.)
The second possibility is that the current writers of screenplays and novels have a special animus against professors, especially English professors. Given the rumor that screenwriters are often former English majors or English graduate students and that novelists tend to have creative-writing appointments that put them in regular contact with English professors, that they sometimes are English professors, and that in any case they have particular reason, given the relation between artist and critic, to be suspicious of English professors, there may be something to this hypothesis.
But there are larger reasons for the rise of the new academic stereotype — reasons that are rooted in some of the changes that have come to American society and to the academy’s place within it over the last six decades, and especially over the last three or four. Americans’ traditional resentment of hierarchy and hostility toward intellect have intensified since World War II and particularly since the 1960s. Elites have been discredited, the notion of high culture dethroned, the means of communication decentralized. Public discourse has become more demotic; families, churches, and other institutions more democratic. The existence of academia, an institution predicated on intellectual hierarchy, irritates Americans’ insistence on equality, their feeling that intellect constitutes a contemptible kind of advantage. At the same time, as American society has become more meritocratic, its economy more technocratic, people want that advantage for themselves or their children. With the U.S. News rankings and the annual admissions frenzy, universities are playing an ever-more conspicuous role in creating the larger social hierarchy that no one acknowledges but everyone wants to climb. It’s no wonder that people resent the gatekeepers and enjoy seeing them symbolically humiliated.
The huge expansion of the college population in the decades after World War II also created a new professoriate. If academics once tended to be gentle, unworldly souls (or even if they were just seen that way), that’s because they could afford to be. Advancement within the profession depended to a great extent on a relatively small, informal, old-boy network. Modest pay meant that many academics came from the social elite and could rely on private incomes. But with the postwar boom in higher education, academia became a viable career for vast numbers of people from beyond the establishment: the bright, striving sons of the great unwashed. Later, with concerns about fair labor practices that followed the rights revolution, the whole system of hiring and promotion became formalized and regularized. Still more recently, the contraction of the college-age population and the casualization of academic labor — the move from permanent faculty positions to adjuncts, postdocs, and instructors — have created the job crunch of the last two decades. The old-boy network has given way to an unceasing scramble for position, and the kindly luftmensch has been displaced by the careerist parvenu. In today’s graduate programs, the watchword is professionalization; no one talks much about the life of the mind anymore. Of course, the old gentility rested on exclusion, and the new rat race is meritocracy in motion; but images aren’t necessarily fair. The new academic stereotype, with its emphasis on moral failure and the frustrations of petty ambition, registers this generational change.
But the one respect in which the new academic stereotype departs most radically from current reality — yet in so doing most fully reflects what’s been happening in American culture of late and most clearly reveals the current state of the American psyche — has to do with sex. As we’ve seen, one of the things nearly all professors in movies and novels have in common is that they sleep with their students. This is true even when the professor in question doesn’t otherwise conform to the new stereotype. In fact, lust is almost the only emotion that movie professors ever express toward their students. In the rare scenes in which these teachers actually teach, the point is to exhibit the classroom or office hour as a locus of sexual tension. The popular mind can’t seem to imagine what other kind of relationship, let alone what other kind of intimacy, a professor and student could share. And it certainly can’t imagine what other sort of gratification a person could derive from teaching in a university.
Why has this idea of universities as dens of vice, where creepy middle-aged men lie in wait for nubile young women, arisen in the last few decades? First, coeducation. Coed colleges have existed since the early 19th century, and large numbers of public universities, in particular, have been coed since late in that century. But the great wave of coeducation at the nation’s elite private schools, which take the lead in forming the public image of university life, did not hit until the late 1960s. At the same time, women were becoming an increasingly visible presence at schools that had already been coed. Another upheaval was under way by then, as well: the sexual revolution. Suddenly, professors had access to large numbers of young women, and just as suddenly, young women were asserting their sexuality with new freedom and boldness. People drew the inevitable conclusion. Since then, American culture has only become increasingly sexualized — which means, for the most part, that youth has become increasingly sexualized by the culture. Not coincidentally, concern about the sexual exploitation of children has reached the dimension of a moral panic. In the figure of the movie professor, Americans can vicariously enjoy the thought of close proximity to all that firm young flesh while simultaneously condemning the desire to enjoy it — the old Puritan dodge.
The situation is heightened and made ironic by two other recent developments. The famously overprotective parenting style of the baby-boom generation has put pressure on universities to revert to acting in loco parentis, forcing them to take on the paternalistic role the boomers rejected during their own college years. Professors are the surrogate parents that parents hand their children over to, and the raising and casting out of the specter of the sexually predatory academic may be a way of purging the anxiety that transaction evokes. But long before the baby boomers’ offspring started to reach college, the feminist campaign against sexual harassment — most effective in academia, the institution most responsive to feminist concerns — had turned universities into the most anxiously self-patrolled workplace in American society, especially when it comes to relations between professors and undergraduates. This is not to suggest that sexual contact between college students and professors, welcome or unwelcome, never takes place, but the belief that it is the norm is a product of fantasy, not fact.
Still, there is a reality behind the new, sexualized academic stereotype, only it is not what the larger society thinks. Nor is it one that society is equipped to understand. The relationship between professors and students can indeed be intensely intimate, as our culture nervously suspects, but its intimacy, when it occurs, is an intimacy of the mind. I would even go so far as to say that in many cases it is an intimacy of the soul. And so the professor-student relationship, at its best, raises two problems for the American imagination: it begins in the intellect, that suspect faculty, and it involves a form of love that is neither erotic nor familial, the only two forms our culture understands. Eros in the true sense is at the heart of the pedagogical relationship, but the professor isn’t the one who falls in love.
Love is a flame, and the good teacher raises in students a burning desire for his or her approval and attention, his or her voice and presence, that is erotic in its urgency and intensity. The professor ignites these feelings just by standing in front of a classroom talking about Shakespeare or anthropology or physics, because the fruits of the mind are that sweet, and intellect has the power to call forth new forces in the soul. Students will sometimes mistake this earthquake for sexual attraction, and the foolish or inexperienced or cynical instructor will exploit that confusion for his or her own gratification. But the great majority of professors understand that the art of teaching consists not only of arousing desire but of redirecting it toward its proper object, from the teacher to the thing taught. Teaching, Yeats said, is lighting a fire, not filling a bucket, and this is how it gets lit. The professor becomes the student’s muse, the figure to whom the labors of the semester — the studying, the speaking in class, the writing — are consecrated. The alert student understands this. In talking to one of my teaching assistants about these matters, I asked her if she’d ever had a crush on an instructor when she was in college. Yes, she said, a young graduate student. “And did you want to have sex with him?” I asked. “No,” she said, “I wanted to have brain sex with him.”
I’m not saying anything new here. All of this was known to Socrates, the greatest of teachers, and laid out in the Symposium, Plato’s dramatization of his mentor’s erotic pedagogy. We are all “pregnant in soul,” Socrates tells his companions, and we are drawn to beautiful souls because they make us teem with thoughts that beg to be brought into the world. The imagery seems contradictory: are we pregnant already, or does the proximity of beautiful souls make us so? Both: the true teacher helps us discover things we already knew, only we didn’t know we knew them. The imagery is also deliberately sexual. The Symposium, in which the brightest wits of Athens spend the night drinking, discoursing on love, and lying on couches two by two, is charged with sexual tension. But Socrates wants to teach his companions that the beauty of souls is greater than the beauty of bodies.
And just as he finishes educing this idea, in walks Alcibiades, the most beautiful young man in the city. Alcibiades was the brilliant bad boy of late fifth-century B.C. Athenian politics, a cross between Jack Kennedy and Jimmy Dean, and Socrates must have known that he was the most interesting student he would ever meet, because Socrates’ love for him was legendary. But it wasn’t the kind his beloved imagined, and Alcibiades complains about how the older man, after bewitching him with divine conversation, would refuse to touch him. The sexy young student had fallen, to his amazement, for the ugly old teacher. At last, Alcibiades tells us, he contrived to get Socrates alone — let’s call this office hours — only to discover that all his teacher wanted to do was engage in more conversation. The “eros of souls,” in Allan Bloom’s Platonic phrase — “brain sex,” in plainer language — is not only higher than the eros of bodies, it is more satisfying.
Can there be a culture less equipped than ours to receive these ideas? Sex is the god we worship most fervently; to deny that it is the greatest of pleasures is to commit cultural blasphemy. In any case, how can you have an eros of souls if you don’t have souls? Our inability to understand intimacy that is neither sexual nor familial is linked to the impoverishment of our spiritual vocabulary. Religion still speaks of the soul, but to the popular mind, at least, it means something remote from our earthly self. What it should mean is the self, the heart and mind, or the heart-mind, as it develops through experience. That’s what Keats meant when he called the world a “vale of soul-making.” And because we’re unequipped to understand the soul in this sense, we’re unequipped to understand Socrates’ belief that the soul’s offspring are greater than the body’s: that ideas are more valuable than children.
Another blasphemy. If there’s one god our culture worships as piously as sex, it’s children. But sex and children, sexual intimacy and familial intimacy, have something in common — beyond the fact that one leads to the other: both belong to us as creatures of nature, not as creators in culture. After Rousseau and Darwin and Freud, and with evolutionary psychology preaching the new moral gospel, we’ve become convinced that our natural self is our truest one. To be natural, we believe, is to be healthy and free. Culture is confinement and deformation. But the Greeks thought otherwise. To them, our highest good is not what we share with the animals, but what we don’t share with them, not the nature we’re born with, but the culture we make from it — make, indeed, against it.
That is why, for the Greeks, the teacher’s relationship with the child was regarded as more valuable and more intimate than the parents’. Your parents bring you into nature, but your teacher brings you into culture. Natural transmission is easy; any animal can do it. Cultural transmission is hard; it takes a teacher. But Socrates also inaugurated a new idea about what teaching means. His students had already been educated into their culture by the time they got to him. He wanted to educate them out of it, teach them to question its values. His teaching wasn’t cultural, it was counter-cultural. The Athenians understood Socrates very well when they convicted him of corrupting their youth, and if today’s parents are worried about trusting their children to professors, this countercultural possibility is really what they should be worried about. Teaching, as Neil Postman says, is a subversive activity — all the more so today, when children are marinated in cultural messages from the moment they’re born. It no longer takes any training to learn to bow to your city’s gods (sex or children, money or nation). But it often takes a teacher to help you question those gods. The teacher’s job, in Keats’s terms, is to point you through the vale of soul-making. We’re born once, into nature and into the culture that quickly becomes a second nature. But then, if we’re granted such grace, we’re born again. For what does it profit a man if he gains the whole world and loses his mortal soul?
This is the kind of sex professors are having with their students behind closed doors: brain sex. And this is why we put up with the mediocre pay and the cultural contempt, not to mention the myriad indignities of graduate school and the tenure process. I know perfectly well that not every professor or every student feels this way or acts this way, nor does every university make it possible for them to do so. There are hacks and prima donnas at the front of many classrooms, slackers and zombies in the seats. And it doesn’t matter who’s in either position if the instructor is teaching four classes at three different campuses or if there are 500 people in the lecture hall. But there are far more true teachers and far more true students at all levels of the university system than those at its top echelons like to believe. In fact, kids who have had fewer educational advantages before they get to college are often more eager to learn and more ready to have their deepest convictions overturned than their more fortunate peers. And it is often away from the elite schools — where a single-minded focus on research plus a talent for bureaucratic maneuvering are the necessary tickets to success — that true teaching most flourishes.
What attracts professors to students, then, is not their bodies but their souls. Young people are still curious about ideas, still believe in them — in their importance, their redemptive power. Socrates says in the Symposium that the hardest thing about being ignorant is that you’re content with yourself, but for many kids when they get to college, this is not yet true. They recognize themselves as incomplete, and they recognize, if only intuitively, that completion comes through eros. So they seek out professors with whom to have relationships, and we seek them out in turn. Teaching, finally, is about relationships. It is mentorship, not instruction. Socrates also says that the bond between teacher and student lasts a lifetime, even when the two are no longer together. And so it is. Student succeeds student, and I know that even the ones I’m closest to now will soon become names in my address book and then just distant memories. But the feelings we have for the teachers or students who have meant the most to us, like those we have for long-lost friends, never go away. They are part of us, and the briefest thought revives them, and we know that in some heaven we will all meet again.
The truth is that these possibilities are not quite as alien to American culture as I’ve been making out. Along with the new stereotype that’s dominated the portrayal of academics in film and fiction in recent years has come, far less frequently, a different image of what a college teacher can be and mean, exactly along the lines I’ve been tracing. It is there in Julia Roberts’s character in Mona Lisa Smile, in the blind professor who teaches Cameron Diaz’s character to love poetry in In Her Shoes, and most obviously, in Tuesdays with Morrie, that gargantuan cultural phenomenon. Robin Williams offered a scholastic version in Dead Poets Society. But we seem to need to keep the idea, or at least the person who embodies it, at a safe distance. Both Mona Lisa Smile and Dead Poets Society take place in the 1950s and at single-sex schools. Cameron Diaz’s mentor and Morrie Schwartz are retired and dying. The Socratic relationship is so profoundly disturbing to our culture that it must be defused before it can be approached. Yet many thousands of kids go off to college every year hoping, at least dimly, to experience it. It has become a kind of suppressed cultural memory, a haunting imaginative possibility. In our sex-stupefied, anti-intellectual culture, the eros of souls has become the love that dares not speak its name.
William Deresiewicz teaches at Yale.
AS a weary nation awaited the fade-out of "The Sopranos" last Sunday, the widow of the actual Mafia don John Gotti visited his tomb in Queens to observe the fifth anniversary of his death. Victoria Gotti was not pleased to find reporters lying in wait.
"It's disgusting that people are still obsessed with Gotti and the mob," she told The Daily News. "They should be obsessed with that mob in Washington. They have 3,000 deaths on their hands." She demanded to know if the president and vice president have relatives on the front lines. "Every time I watch the news and I hear of another death," she said, "it sickens me."
Far be it from me to cross any member of the Gotti family, but there's nothing wrong with being obsessed with both mobs. Now that the approval rating for the entire Washington franchise, the president and Congress alike, has plummeted into the 20s, we need any distraction we can get; the Mafia is a welcome nostalgic escape from a gridlocked government at home and epic violence abroad.
But unlikely moral arbiter that Mrs. Gotti may be, she does have a point. As the Iraq war careens toward a denouement as black, unresolved and terrifying as David Chase's inspired "Sopranos" finale, the mob in the capital deserves at least equal attention. John Gotti, the last don, is dead. Mr. Chase's series is over. But the deaths on the nightly news are coming as fast as ever.
True, the Washington mob isn't as sexy as the Gotti or Soprano clans, but there is now a gripping nonfiction dramatization of its machinations available gratis on the Internet, no HBO subscription required. For this we can thank U.S. District Judge Reggie Walton, who presided over the Scooter Libby trial. Judge Walton's greatest move was not the 30-month sentence he gave Mr. Libby, a fall guy for higher-ups (and certain to be pardoned to protect their secrets). It was instead the judge's decision to make public the testimonials written to the court by members of the Washington establishment pleading that a criminal convicted on four felony counts be set free.
Mr. Libby's lawyers argued that these letters should remain locked away on the hilarious grounds that they might be "discussed, even mocked, by bloggers." And apparently many of the correspondents assumed that their missives would remain private, just like all other documents pertaining to Mr. Libby's former boss, Dick Cheney. The result is very little self-censorship among the authors and an epistolary gold mine for readers.
Among those contributing to the 373 pages of what thesmokinggun.com calls "Scooter Libby Love Letters" are self-identified liberals and Democrats, a few journalists (including a contributing writer to The New York Times Magazine) and a goodly sample of those who presided over the Iraq catastrophe or cheered it on. This is a documentary snapshot of the elite Washington mob of our time.
Like the scripts for "The Sopranos," the letters are not without mordant laughs. Henry Kissinger writes a perfunctory two paragraphs, of which the one about Mr. Libby rather than himself seems an afterthought. James Carville co-signs a letter by Mary Matalin tediously detailing Mr. Libby's devotion to organizing trick-or-treat festivities for administration children spending a post-9/11 Halloween at an "undisclosed location." One correspondent writes in astonishment that Mr. Libby once helped "a neighbor who is a staunch Democrat" dig his car out of the snow, and another is in awe that Mr. Libby would "personally buy his son a gift rather than passing the task on to his wife." Many praise Mr. Libby's novel, "The Apprentice," apparently on the principle that an overwritten slab of published fiction might legitimize the short stories he fabricated freelance for a grand jury.
But what makes these letters rise above inanity is the portrait they provide of a wartime capital cut adrift from moral bearings. As the political historian Rick Perlstein has written, one of the recurrent themes of these pleas for mercy is that Mr. Libby perjured himself "only because he was so busy protecting us from Armageddon." Has there ever been a government leader convicted of a crime — and I don't mean only Americans — who didn't see himself as saving the world from the enemy?
The Libby supporters never acknowledge the undisputed fact that their hero, a lawyer by profession, leaked classified information about a covert C.I.A. officer. And that he did so not accidentally but to try to silence an administration critic who called attention to the White House's prewar lies about W.M.D. intelligence. And that he compounded the original lies by lying repeatedly to investigators pursuing an inquiry that without his interference might have nailed others now known to have also leaked Valerie Wilson's identity (Richard Armitage, Karl Rove, Ari Fleischer).
Much has been said about the hypocrisy of those on the right, champions both of Bill Clinton's impeachment and of unflinching immigration enforcement, who call for legal amnesty in Mr. Libby's case. To thicken their exquisite bind, these selective sticklers for strict justice have been foiled in their usual drill of attacking the judge in the case as "liberal." Judge Walton was initially appointed to the bench by Ronald Reagan and was elevated to his present job by the current President Bush; he was assigned as well to the Foreign Intelligence Surveillance Court by the Bush-appointed chief justice, John Roberts. Such credentials notwithstanding, Judge Walton told the court on Thursday that he was alarmed by new correspondence and phone calls from the Libby mob since the sentencing "wishing bad things" on him and his family.
In Washington, however, hypocrisy is a perennial crime in both parties; if all the city's hypocrites were put in jail, there would be no one left to run the government. What is more striking about the Libby love letters is how nearly all of them ignore the reality that the crime of lying under oath is at the heart of the case. That issue simply isn't on these letter writers' radar screen; the criminal act of perjury isn't addressed (unless it's ascribed to memory loss because Mr. Libby was so darn busy saving the world). Given that Mr. Libby expressed no contrition in court after being convicted, you'd think some of his defenders might step into that moral vacuum to speak for him. But there's been so much lying surrounding this war from the start that everyone is inured to it by now. In Washington, lying no longer registers as an offense against the rule of law.
Instead the letter writers repeat tirelessly that Mr. Libby is a victim, suffering "permanent damage" to his reputation, family and career in the typical judgment of Kenneth Adelman, the foreign-policy thinker who predicted a "cakewalk" for America in Iraq. There's a whole lot of projection going on, because to judge from these letters, those who drummed up this war think of themselves as victims too. In his letter, the disgraced Paul Wolfowitz sees his friend's case as an excuse to deflect his own culpability for the fiasco. He writes that "during the spring and summer of 2003, when some others were envisioning a prolonged American occupation," Mr. Libby "was a strong advocate for a more rapid build-up of the Iraqi Army and a more rapid transfer of sovereignty to the Iraqis, points on which history will prove him to have been prescient."
History will prove no such thing; a "rapid" buildup of the Iraqi Army was and is a mirage, and the neocons' chosen leader for an instant sovereign Iraq, Ahmad Chalabi, had no political following. But Mr. Wolfowitz's real point is to pin his own catastrophic blundering on L. Paul Bremer, the neocons' chosen scapegoat for a policy that was doomed with or without Mr. Bremer's incompetent execution of the American occupation.
Of all the Libby worshipers, the one most mocked in the blogosphere and beyond is Fouad Ajami, the Lebanese-American academic and war proponent who fantasized that a liberated Iraq would have a (positive) "contagion effect" on the region and that Americans would be greeted "in Baghdad and Basra with kites and boom boxes." (I guess it all depends on your definition of "boom boxes.") In an open letter to President Bush for The Wall Street Journal op-ed page on June 8, he embroidered his initial letter to Judge Walton, likening Mr. Libby to a "fallen soldier" in the Iraq war. In Mr. Ajami's view, Tim Russert (whose testimony contradicted Mr. Libby's) and the American system of justice are untrustworthy, and "the 'covertness' of Mrs. Wilson was never convincingly and fully established." (The C.I.A. confirmed her covert status in court documents filed in May.)
Mr. Ajami notes, accurately, that the trial was "about the Iraq war and its legitimacy" — an argument that could also be mustered by defenders of Alger Hiss who felt his perjury trial was about the cold war. But it's even more revealing that the only "casualty of a war" Mr. Ajami's conscience prompts him to mention is Mr. Libby, a figurative casualty rather than a literal one.
No wonder Victoria Gotti denigrated "that mob in Washington." When the godfathers of this war speak of never leaving "a fallen comrade" on the battlefield in Iraq, as Mr. Ajami writes of Mr. Libby, they are speaking first and foremost of one another. The soldiers still making the ultimate sacrifice for this gang's hubristic folly will just have to fend for themselves.
Saturday, June 16
Thus, although he seriously believes that his extremely conservative legal opinions are in the best interests of African-Americans, and yearns to be respected by them, he is arguably one of the most viscerally despised people in black America. It is incontestable that he has benefited from affirmative action at critical moments in his life, yet he denounces the policy and has persuaded himself that it played little part in his success. He berates disadvantaged people who view themselves as victims of racism and preaches an austere individualism, yet harbors self-pitying feelings of resentment and anger at his own experiences of racism. His ardent defense of states’ rights would have required him to uphold Virginia’s anti-miscegenation law, not to mention segregated education, yet he lives with a white wife in Virginia. He is said to dislike light-skinned blacks, yet he is the legal guardian of a biracial child, the son of one of his numerous poor relatives. He frequently preaches the virtues of honesty and truthfulness, yet there is now little doubt that he lied repeatedly during his confirmation hearings — not only about his pornophilia and bawdy humor but, more important, about his legal views and familiarity with cases like Roe v. Wade.
Kevin Merida and Michael A. Fletcher conducted hundreds of interviews with Thomas’s friends, relatives and colleagues for “Supreme Discomfort,” in addition to doing extensive archival research. Although Thomas refused to be interviewed, this was not a serious handicap, given his vast paper and video trail and his volubility about his feelings. The authors superbly deconstruct Thomas’s multiple narratives of critical life-events — the accounts vary depending on his audience — and it says much for their intellectual integrity that though they are clearly critical of their subject, their presentation allows readers to make their own judgments. Thomas is examined through the prism of race because, they argue, “that is the prism through which Thomas often views himself,” and their main argument is that “he is in constant struggle with his racial identity — twisting, churning, sometimes hiding from it, but never denying it, even when he’s defiant about it.”
The first third of the book assiduously assembles the shards of his life from his birth in Pin Point, Ga., to his nomination to the Supreme Court by President George H. W. Bush in 1991, and it casts new light on the social and psychological context in which Thomas fashioned himself. Pin Point, where he spent his first six years, comes as close to a scene of rural desolation as is possible in an advanced society. This is black life in the rural South at its bleakest, in which the best hope of the law-abiding is a job at the old crab-picking factory. It is in this sociological nightmare that a 6-year-old boy, by some miracle of human agency, discovers the path to survival through absorption in books. Born to a teenage mother, abandoned by his father when he was a year old, plunged into the even more frightening poverty of the Savannah ghetto, Thomas, along with his brother, was eventually rescued by his grandparents.
Thomas has made a paragon of his maternal grandfather, Myers Anderson, an illiterate man who, through superhuman effort, native intelligence and upright living, was able to provide a fair degree of security for his family. Anderson cared deeply for the downtrodden, and the hard turn in Thomas’s adult individualism cannot be attributed to him. Indeed, it turns out that the man Thomas reveres disapproved strongly of his conservative politics.
Three other important forces shaped Thomas. In addition to white racism, he suffered the color prejudice of lighter-complexioned blacks. This dimension of black life has been so played down with the rise of identity politics that it comes as a shock to find a black person of the civil rights generation who feels he was severely scarred by it. Thomas says that growing up, he was teased mercilessly because his hair, complexion and features were too “Negroid” and that his schoolyard nickname was “ABC: America’s Blackest Child.” The authors seem inclined to believe contemporaries of Thomas who claim that he exaggerates and has confused class prejudice with color prejudice, as if class prejudice were any less execrable. On this, I’m inclined to believe Thomas, although, given where he now sits, the wife he sleeps with, the child he has custody of and the company he keeps, it might be time to get over it.
But Thomas bears the scars of yet another black prejudice: not only was he too black, he was also culturally too backcountry. Coastal Georgia is one of the few areas in America where a genuinely Afro-English creole — Gullah — is used, and Thomas grew up speaking it. In Savannah he was repeatedly mocked for his “Geechee” accent and was so traumatized by this that he developed the habit of simply listening when in public. That experience, Thomas claims, helps explain his mysterious silence on the Supreme Court during oral arguments. This seems a stretch, since Thomas is now an eloquent public speaker and an engaging conversationalist who, like most educated Southerners north of home, erased his accent long ago.
Another revealing aspect of Thomas’s upbringing is his difficult relationship with women. He is now reconciled with his mother, but for much of his life he resented and disapproved of her. She, in turn, acknowledges that she preferred his more compassionate brother, who died in 2000. The event that most angered the black community was Thomas’s public rebuke of his sister for being on welfare. The person most responsible for adopting and raising him was his step-grandmother, yet it is his grandfather, who initially spurned him and had abandoned Thomas’s own mother, who gets all the credit. His first career choice was to be a Roman Catholic priest, and he actually spent a year in a seminary, presumably anticipating a vow of chastity. For all his bawdy humor, he was extremely awkward with women, and his bookishness did not help. This hints, perhaps, at one source of his later troubles.
Up to the point of Thomas’s confirmation hearings, this book is a finely drawn portrait that surpasses all previous attempts to understand him. The remainder of the work is more wide-angled. Merida and Fletcher, who are journalists at The Washington Post, take us through the tumultuous hearings, then examine Thomas’s career and personal life up to the present: his complete embrace by the extreme right (he is a friend of Rush Limbaugh’s); his performance on the court; his relationship with Antonin Scalia, an ideological ally who some people think heavily influences Thomas’s thinking; and his secluded private life. We learn interesting things about him — for example, the stark contrast between his sometimes unfeeling legal opinions and his often compassionate personal relationships; the fact that he has quietly facilitated the confirmation of very liberal black judges, often to their amazement; and that he is probably the most accessible of the justices and enjoys the admiration and abiding loyalty of his clerks.
The treatment of Thomas’s legal doctrine, however, is pedestrian. Whatever one’s reservations about his “originalist” philosophy — notoriously, he has held that beating a prisoner is not unconstitutional punishment because it would not have appeared cruel and unusual to the framers — recent evaluations of his opinions by scholars like Henry Mark Holzer and Scott Douglas Gerber indicate that they should be taken seriously. Well, by lawyers anyway. We have also gone beyond the question of “who lied” in our assessment of the hearings. Of greater import would have been a critical examination of the bruising politics behind these hearings, the way both sides manipulated Thomas and Anita Hill, and the questionable ethics and strategic blunder of the left in focusing on Thomas’s sexuality, given America’s malignant racial history on this subject, instead of on his suspect qualifications for the job.
Nonetheless, the book remains invaluable for any understanding of the court’s most controversial figure. It persuasively makes the case that “the problem of color is a mantle” Thomas “yearns to shed, even as he clings to it.” In doing so, it brilliantly illuminates not only Thomas but his turbulent times, the burden of race in 20th-century America, and one man’s painful and unsettling struggle, along with his changing nation’s, to be relieved of it.
Orlando Patterson is a professor of sociology at Harvard and the author of “The Ordeal of Integration: Progress and Resentment in America’s ‘Racial’ Crisis.”
One day in early 1980, I bought a book and boarded a train in Philadelphia's Penn Station, intending to get off at Swarthmore. I missed the stop because I was so absorbed in the book that I never even noticed that we were pulling in and out of a series of small towns. The book was Richard Rorty's Philosophy and the Mirror of Nature, and by the time I finally got to my destination, I was an acolyte. What drew me in and held my passionate attention was not only the daring and bravado of the argument, but the extraordinary power of a style that was at once briskly colloquial—that is, without philosophical pretension—and extraordinarily precise. I later came to know that, in this case at least, the style was the man. When reading Rorty, one always hears the voice—deep, low, a bit gravelly, world-weary, and so deadpan that it seems indifferent to the sentences it is uttering; sentences that are limpidly aphoristic and appear not to do much, although in succession, like perfectly rounded bullet beads on a string, they acquire the force of a locomotive. That was surely their effect on an audience. When Rorty concluded one of his dramatically undramatic performances, the hands shot up like quivering spears, and the questions were hurled in outraged tones that were almost comically in contrast to the low-key withdrawn words that had provoked them.
Why outrage? Because more often than not a Rortyan sentence would, with irritatingly little fuss, take away everything his hearers believed in. Take, for example, this little Rortyan gem: "Time will tell; but epistemology won't." That is to say—and the fact that I have recourse to the ponderously academic circumlocution "that is to say" tells its own (for me) sad story—if you're putting your faith in some grandly ambitious account of the way we know things and hoping that if you get the account right, you will be that much closer to something called Truth, forget it; you may succeed in accomplishing the task at hand or reaching the goal you aim for, but if you do, it will not be because some normative philosophy has guided you and done most of the work, but because you've been lucky or alert enough to fashion the bits and pieces of ideas and materials at your disposal into something that hangs together, at least for the moment. Or, in other, and better words, "Time will tell; but epistemology won't."
A good way of teaching Rorty is simply to give students a baker's dozen of sentences and invite them to tease out the thought of the man who produced them. I have my own "top 10," and the list includes: "The world is out there, but descriptions of the world are not." "A conviction which can be justified to anyone is of little interest." "One would have to be very odd to change one's politics because one had become convinced, for example, that a coherence theory of truth was preferable to a correspondence theory." "What counts as rational argumentation is as historically determined and as context-dependent, as what counts as good French." "It seems to me that I am just as provisional and contextualist as the Nazi teachers who made their students read Der Sturmer; the only difference is that I serve a better cause."
That better cause is the cause of expanding and extending our "sense of 'we' " and bringing more and more persons and vocabularies under the same ecumenical umbrella. At times the ecumenism could be disconcerting. Once at a conference Rorty indicated agreement with an account of his work that seemed to me to be antithetical to its very core. I rose and said so, and he agreed with me, too. I thought, no, it has to be one or the other of us. I still hadn't learned the lesson he was teaching, and now, like everyone else, I will be trying to do so in his absence.
Wednesday, June 13
The immediate occasion of this reflection is this Wall Street Journal op-ed in which I. Lewis “Scooter” Libby is hailed as a “fallen soldier.” The essay is written as a letter to George Bush, and it concludes rather eloquently:
The prosecutor, and the jury and the judge, had before them a case that purported to stand alone, a trial of one man’s memory and recollections. But you have before you what they and the rest of us don’t — a memory of the passions and the panic, and the certitude, which gave rise to the war. And a sense, I am confident, of the quiet and selfless man who sat in the outer circle when your cabinet deliberated over our country’s choices in Iraq, and in those burning grounds of the Arab-Islamic world. Scooter Libby was there for the beginning of that campaign. He can’t be left behind as a casualty of a war our country had once proudly claimed as its own.
Eloquent, I say, and yet somehow . . . what is le mot juste? Ah, yes, I know: batshit insane. Indeed, the more you break it down, the more insane it gets: if Bush fails to pardon Libby, he is leaving behind “a soldier in your — our — war in Iraq.” (Somehow I imagine that Libby will fare better in later life than most of the actual soldiers languishing in Walter Reed and in VA hospitals across the country.) For Libby was engaged in that most noble enterprise of all, helping to blow a covert agent’s cover in order to help the Bush Administration make a fraudulent case for war.
Still, I must admit that there is at least one paragraph in this essay that sounds as if it were written by an undergraduate on an all-nighter:
In “The Soldier’s Creed,” there is a particularly compelling principle: “I will never leave a fallen comrade.” This is a cherished belief, and it has been so since soldiers and chroniclers and philosophers thought about wars and great, common endeavors. Across time and space, cultures, each in its own way, have given voice to this most basic of beliefs. They have done it, we know, to give heart to those who embark on a common mission, to give them confidence that they will not be given up under duress. A process that yields up Scooter Libby to a zealous prosecutor is justice gone awry.
If, like me, you’ve graded stacks of survey-class essays, you know the idiom of those first three sentences: From the beginning of time, readers have thrilled to the brave deeds recorded in the Iliad…. And yet, folks, the author of this essay is no callow undergraduate. He is no Victor Davis Hanson wannabe from the depths of NewsMax. He is no Dorito-flecked lieutenant in the 101st Fighting Keyboarders. He is, rather, Professor Fouad Ajami, the Director of the Middle East Studies Program at the School of Advanced International Studies of Johns Hopkins University. Yes, I know, SAIS is Neocon Central Headquarters. But unlike, say, The Corner, it is serious business. And it is home to some Very Serious People — like Ajami, who sits on the Board of Advisors of the journal Foreign Affairs and counts among his many awards and distinctions a 1982 MacArthur “genius” fellowship in the arts and sciences.
So where, I ask you, is the MacArthur-level Golden Winger for him?
Or take the case of poor Robert Brustein. About a month ago, in response to the latest “study” published by the American Council of Trustees and Alumni (official motto: David Horowitz with a human face), Brustein published a little screed in TNR’s “Open University.” Before I get to the nut graf (so to speak), let me explain a bit of the context.
Every so often, outfits like ACTA put out these “studies” in which they “demonstrate” that (a) professors hate America, (b) Ward Churchill is everywhere, and (c) professors hate Shakespeare. I am not exaggerating (very much), I assure you. One of ACTA’s recent pamphlets (published in May 2006) was indeed called How Many Ward Churchills? (.pdf) You’d think they would milk the suspense — dear me! just how many Ward Churchills are indoctrinating our impressionable little children? — but apparently ACTA didn’t think its readership would have much of an attention span, since the pamphlet starts on page one with the heading “How Many Ward Churchills?” and proceeds to conclude on page two of a 50-page booklet that “Ward Churchill is Everywhere.” You know, sorta like Elvis.
This year, they’ve come up with a new “study,” The Vanishing Shakespeare (another .pdf). Now, in order to appreciate this kind of work, you have to consider its intended audience. (I’m told that the original title was ZOMG! They Are Killing Shakespeare OH NOES!!, which I think captures the spirit of the thing.) The primary audience, of course, consists of people who know nothing about English literature (or college courses in English literature) except that Shakespeare was America’s greatest writer and that Hamlet’s soliloquy from Adventures of Huckleberry Finn is truly sublime. They even open the “study” with the tag line, “Oh! What a noble mind is here o’erthrown,” which, I believe, is very literary. (Although that’s actually Ophelia talking about Hamlet’s “madness,” so it’s a kinda embarrassingly bad literary allusion if you’re trying to suggest that Shakespeare is being abandoned for Dangeral Studies.) And then they follow it up with another very literary reference: “the American Council of Trustees and Alumni researched how Shakespeare fits into English curricula at 70 of the nation’s leading colleges and universities. What we found is, to quote the Bard, ‘the unkindest cut of all.’” Zounds! The unkindest cut! That is extra extra literary. Certainly the authors of this study have brushed up their Shakespeare! Indeed, perhaps English professors are honoring the Bard more in the breach than the observance! That should make everyone concerned with higher education outraged that our college campuses are like the undiscovered country from whose bourne no traveler returns, breathes forth contagion on the world, and thus the native hue of resolution, like the poor cat i’ the adage, is sicklied o’er with care!!!
But there’s a secondary audience, as well — an audience made up of sundry Andrew Aguecheeks who are (a) generally knowledgeable enough about English literature to know who Andrew Aguecheek is and (b) always ready and willing to believe whatever the wingnut culture warriors tell them about Dangeral Professors Today and Why We Hate Shakespeare. Last time around, it worked like a charm — yes, dear readers, believe it or not, ACTA has actually pulled this rabbit out of this old hat before! Back in 1996, when ACTA was known as the National Alumni Forum and was led by Lynne Cheney, they put out the exact same kind of “study”, which was duly reported in the New York Times under the headline, “At Colleges, Sun Is Setting on Shakespeare.” Hey, so maybe that’s why Shakespeare is vanishing! The sun was going down on him in 1996, and now no one can see him! That’s what happens when the sun goes down! That’s why Shakespeare wrote the immortal sonnet, “Don’t Let the Sun Go Down On Me.”
(One serious aside about these “studies.” They survey English departments to see how many of us require classes in Shakespeare. The funny thing is that most English departments don’t need to have Shakespeare requirements. Believe me, Shakespeare is doing just fine in English departments, and he’s been doing fine for a good long time. In fact, if someone were actually serious about trying to find out about Shakespeare’s place in the English curriculum, s/he would look at the total number of Shakespeare courses offered in English departments. But that’s only if someone were actually serious about this kind of thing.)
All right, now back to Brustein. Brustein starts off like so:
A recent report issued by the American Council of Trustees and Alumni documents what we already know: Shakespeare is no longer valued in our educational system.
In other words, ZOMG! They are killing Shakespeare OH NOES! And look what they’re replacing him with:
The report takes note that abandoning Shakespeare does not mean that our higher education system is abandoning requirements. No, it’s just that there are now more urgent things to include on the reading list. “While Shakespeare and other traditionally acclaimed authors such as Chaucer and Milton are no longer required, many institutions such as Rice, Oberlin, and Vanderbilt require students to study ‘non-canonical traditions,’ ‘under-represented cultures,’ and ‘ethnic or non-Western literature.’”
“And at the University of Virginia,” the report continues, “English majors can avoid reading Othello in favor of studying ‘Critical Race Theory’ which explores why race ‘continues to have vital significance in politics, education, culture, arts, and everyday social realities,’ including ’sexuality, class, disability, multiculturalism, nationality, and globalism.’”
But now for the nut graf. After swallowing the ACTA line whole, and getting the ague in his cheek as a result, Brustein concludes somewhat less eloquently than did Ajami on Libby:
A recent newspaper cartoon shows two young girls walking out of a school. One turns to the other and says, “I have two mommies.” The other replies, “How much is two?”
All right. I may be reading too much into this, but don’t blame me — I learned how to read from guys like Shakespeare. And to my ears, this sounds a little bit strange. It sounds like Robert Brustein, the founding director of the Yale Repertory Theater and Harvard’s American Repertory Theater, the former dean of the Yale School of Drama, a senior research fellow at Harvard, and the theater critic of the Even the Liberal New Republic, has suggested that ACTA’s latest ZOMG They Are Killing Shakespeare OH NOES study has something to do with (a) a young girl who says she has two mommies and (b) another young girl who does not know how to count to two. And again, I don’t want to “interpret” this in a way that violates the “author’s” “intention” (especially since it is a “citation” and not a “use” of the cartoon), but it sounds a little bit like Robert Brustein, the founding director of the Yale Repertory Theater and Harvard’s American Repertory Theater, the former dean of the Yale School of Drama, a senior research fellow at Harvard, and the theater critic of the Even the Liberal New Republic, is making some kind of correlation between Heather Has Two Mommies and children who are unable to count. It almost sounds as if Brustein is suggesting that These Kids Today learn a lot about lesbianism in Our Schools Today but nothing about simple math (and, uh, therefore, Shakespeare). And it sounds as if Brustein is now getting his material from Mallard Fillmore.
If that’s the case, I ask again: where are the Kippies for such as these? For oh! what semi-noble minds are here o’erthrown!
Tuesday, June 12
In "Digital Maoism", an original essay written for Edge, computer scientist and digital visionary Jaron Lanier finds fault with what he terms the new online collectivism. He cites as an example the Wikipedia, noting that "reading a Wikipedia entry is like reading the bible closely. There are faint traces of the voices of various anonymous authors and editors, though it is impossible to be sure".
His problem is not with the unfolding experiment of the Wikipedia itself, but "the way the Wikipedia has come to be regarded and used; how it's been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous".
And he notes that "the Wikipedia is far from being the only online fetish site for foolish collectivism. There's a frantic race taking place online to become the most 'Meta' site, to be the highest level aggregator, subsuming the identity of all other sites".
Where is this leading? Lanier calls attention to the "so-called 'Artificial Intelligence' and the race to erase personality and be most Meta. In each case, there's a presumption that something like a distinct kin to individual human intelligence is either about to appear any minute, or has already appeared. The problem with that presumption is that people are all too willing to lower standards in order to make the purported newcomer appear smart. Just as people are willing to bend over backwards and make themselves stupid in order to make an AI interface appear smart (as happens when someone can interact with the notorious Microsoft paper clip,) so are they willing to become uncritical and dim in order to make Meta-aggregator sites appear to be coherent."
Read on as Jaron Lanier throws a lit Molotov cocktail down towards Palo Alto from up in the Berkeley Hills...
by JARON LANIER
My Wikipedia entry identifies me (at least this week) as a film director. It is true I made one experimental short film about a decade and a half ago. The concept was awful: I tried to imagine what Maya Deren would have done with morphing. It was shown once at a film festival and was never distributed and I would be most comfortable if no one ever sees it again.
In the real world it is easy to not direct films. I have attempted to retire from directing films in the alternative universe that is the Wikipedia a number of times, but somebody always overrules me. Every time my Wikipedia entry is corrected, within a day I'm turned into a film director again. I can think of no more suitable punishment than making these determined Wikipedia goblins actually watch my one small old movie.
Twice in the past several weeks, reporters have asked me about my filmmaking career. The fantasies of the goblins have entered that portion of the world that is attempting to remain real. I know I've gotten off easy. The errors in my Wikipedia bio have been (at least prior to the publication of this article) charming and even flattering.
Reading a Wikipedia entry is like reading the bible closely. There are faint traces of the voices of various anonymous authors and editors, though it is impossible to be sure. In my particular case, it appears that the goblins are probably members or descendants of the rather sweet old Mondo 2000 culture linking psychedelic experimentation with computers. They seem to place great importance on relating my ideas to those of the psychedelic luminaries of old (and in ways that I happen to find sloppy and incorrect.) Edits deviating from this set of odd ideas that are important to this one particular small subculture are immediately removed. This makes sense. Who else would volunteer to pay that much attention and do all that work?
The problem I am concerned with here is not the Wikipedia in itself. It's been criticized quite a lot, especially in the last year, but the Wikipedia is just one experiment that still has room to change and grow. At the very least it's a success at revealing what the online people with the most determination and time on their hands are thinking, and that's actually interesting information.
No, the problem is in the way the Wikipedia has come to be regarded and used; how it's been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous.
There was a well-publicized study in Nature last year comparing the accuracy of the Wikipedia to Encyclopedia Britannica. The results were a toss-up, though there is a lingering debate about the validity of the study. The items selected for the comparison were just the sort that Wikipedia would do well on: science topics that the collective at large doesn't care much about. "Kinetic isotope effect" or "Vesalius, Andreas" are examples of topics that make the Britannica hard to maintain, because it takes work to find the right authors to research and review a multitude of diverse topics. But they are perfect for the Wikipedia. There is little controversy around these items, plus the Net provides ready access to a reasonably small number of competent specialist graduate student types possessing the manic motivation of youth.
A core belief of the wiki world is that whatever problems exist in the wiki will be incrementally corrected as the process unfolds. This is analogous to the claims of Hyper-Libertarians who put infinite faith in a free market, or the Hyper-Lefties who are somehow able to sit through consensus decision-making processes. In all these cases, it seems to me that empirical evidence has yielded mixed results. Sometimes loosely structured collective activities yield continuous improvements and sometimes they don't. Often we don't live long enough to find out. Later in this essay I'll point out what constraints make a collective smart. But first, it's important to not lose sight of values just because the question of whether a collective can be smart is so fascinating. Accuracy in a text is not enough. A desirable text is more than a collection of accurate references. It is also an expression of personality.
For instance, most of the technical or scientific information that is in the Wikipedia was already on the Web before the Wikipedia was started. You could always use Google or other search services to find information about items that are now wikified. In some cases I have noticed specific texts get cloned from original sites at universities or labs onto wiki pages. And when that happens, each text loses part of its value. Since search engines are now more likely to point you to the wikified versions, the Web has lost some of its flavor in casual use.
When you see the context in which something was written and you know who the author was beyond just a name, you learn so much more than when you find the same text placed in the anonymous, faux-authoritative, anti-contextual brew of the Wikipedia. The question isn't just one of authentication and accountability, though those are important, but something more subtle. A voice should be sensed as a whole. You have to have a chance to sense personality in order for language to have its full meaning. Personal Web pages do that, as do journals and books. Even Britannica has an editorial voice, which some people have criticized as being vaguely too "Dead White Men."
If an ironic Web site devoted to destroying cinema claimed that I was a filmmaker, it would suddenly make sense. That would be an authentic piece of text. But placed out of context in the Wikipedia, it becomes drivel.
Myspace is another recent experiment that has become even more influential than the Wikipedia. Like the Wikipedia, it adds just a little to the powers already present on the Web in order to inspire a dramatic shift in use. Myspace is all about authorship, but it doesn't pretend to be all-wise. You can always tell at least a little about the character of the person who made a Myspace page. But it is very rare indeed that a Myspace page inspires even the slightest confidence that the author is a trustworthy authority. Hurray for Myspace on that count!
Myspace is a richer, multi-layered source of information than the Wikipedia, although the topics the two services cover barely overlap. If you want to research a TV show in terms of what people think of it, Myspace will reveal more to you than the analogous and enormous entries in the Wikipedia.
The Wikipedia is far from being the only online fetish site for foolish collectivism. There's a frantic race taking place online to become the most "Meta" site, to be the highest level aggregator, subsuming the identity of all other sites.
The race began innocently enough with the notion of creating directories of online destinations, such as the early incarnations of Yahoo. Then came AltaVista, where one could search using an inverted database of the content of the whole Web. Then came Google, which added page rank algorithms. Then came the blogs, which varied greatly in terms of quality and importance. This led to Meta-blogs such as Boing Boing, run by identified humans, which served to aggregate blogs. In all of these formulations, real people were still in charge. An individual or individuals were presenting a personality and taking responsibility.
These Web-based designs assumed that value would flow from people. It was still clear, in all such designs, that the Web was made of people, and that ultimately value always came from connecting with real humans.
Even Google by itself (as it stands today) isn't Meta enough to be a problem. One layer of page ranking is hardly a threat to authorship, but an accumulation of many layers can create a meaningless murk, and that is another matter.
In the last year or two the trend has been to remove the scent of people, so as to come as close as possible to simulating the appearance of content emerging out of the Web as if it were speaking to us as a supernatural oracle. This is where the use of the Internet crosses the line into delusion.
Kevin Kelly, the former editor of Whole Earth Review and the founding Executive Editor of Wired, is a friend and someone who has been thinking about what he and others call the "Hive Mind." He runs a Web site called Cool Tools that's a cross between a blog and the old Whole Earth Catalog. On Cool Tools, the contributors, including me, are not a hive because we are identified.
In March, Kelly reviewed a variety of "Consensus Web filters" such as "Digg" and "Reddit" that assemble material every day from the myriad of other aggregating sites. Such sites intend to be more Meta than the sites they aggregate. There is no person taking responsibility for what appears on them, only an algorithm. The hope seems to be that the most Meta site will become the mother of all bottlenecks and receive infinite funding.
That new magnitude of Meta-ness lasted only a month. In April, Kelly reviewed a site called "popurls" that aggregates consensus Web filtering sites...and there was a new "most Meta." We are now reading what a collectivity algorithm derives from what other collectivity algorithms derived from what collectives chose from what a population of mostly amateur writers wrote anonymously.
Is "popurls" any good? I am writing this on May 27, 2006. In the last few days an experimental approach to diabetes management has been announced that might prevent nerve damage. That's huge news for tens of millions of Americans. It is not mentioned on popurls. Popurls does clue us in to this news: "Student sets simultaneous world ice cream-eating record, worst ever ice cream headache." Mainstream news sources all lead today with a serious earthquake in Java. Popurls includes a few mentions of the event, but they are buried within the aggregation of aggregate news sites like Google News. The reason the quake appears on popurls at all can be discovered only if you dig through all the aggregating layers to find the original sources, which are those rare entries actually created by professional writers and editors who sign their names. But at the layer of popurls, the ice cream story and the Javanese earthquake are at best equals, without context or authorship.
Kevin Kelly says of the "popurls" site, "There's no better way to watch the hive mind." But the hive mind is for the most part stupid and boring. Why pay attention to it?
Readers of my previous rants will notice a parallel between my discomfort with so-called "Artificial Intelligence" and the race to erase personality and be most Meta. In each case, there's a presumption that something akin to a distinct individual human intelligence is either about to appear any minute or has already appeared. The problem with that presumption is that people are all too willing to lower standards in order to make the purported newcomer appear smart. Just as people are willing to bend over backwards and make themselves stupid in order to make an AI interface appear smart (as happens when someone interacts with the notorious Microsoft paper clip), so are they willing to become uncritical and dim in order to make Meta-aggregator sites appear to be coherent.
There is a pedagogical connection between the culture of Artificial Intelligence and the strange allure of anonymous collectivism online. Google's vast servers and the Wikipedia are both mentioned frequently as being the startup memory for Artificial Intelligences to come. Larry Page is quoted via a link presented to me by popurls this morning (who knows if it's accurate) as speculating that an AI might appear within Google within a few years. George Dyson has wondered if such an entity already exists on the Net, perhaps perched within Google. My point here is not to argue about the existence of metaphysical entities, but just to emphasize how premature and dangerous it is to lower the expectations we hold for individual human intellects.
The beauty of the Internet is that it connects people. The value is in the other people. If we start to believe that the Internet itself is an entity that has something to say, we're devaluing those people and making ourselves into idiots.
Compounding the problem is that new business models for people who think and write have not appeared as quickly as we all hoped. Newspapers, for instance, are on the whole facing a grim decline as the Internet takes over the feeding of curious eyes that hover over morning coffee and, even worse, classified ads. In the new environment, Google News is for the moment better funded and enjoys a more secure future than most of the rather small number of fine reporters around the world who ultimately create most of its content. The aggregator is richer than the aggregated.
The question of new business models for content creators on the Internet is a profound and difficult topic in itself, but it must at least be pointed out that writing professionally and well takes time and that most authors need to be paid to take that time. In this regard, blogging is not writing. For example, it's easy to be loved as a blogger. All you have to do is play to the crowd. Or you can flame the crowd to get attention. Nothing is wrong with either of those activities. What I think of as real writing, however, writing meant to last, is something else. It involves articulating a perspective that is not just reactive to yesterday's moves in a conversation.
The artificial elevation of all things Meta is not confined to online culture. It is having a profound influence on how decisions are made in America.
What we are witnessing today is the alarming rise of the fallacy of the infallible collective. Numerous elite organizations have been swept off their feet by the idea. They are inspired by the rise of the Wikipedia, by the wealth of Google, and by the rush of entrepreneurs to be the most Meta. Government agencies, top corporate planning departments, and major universities have all gotten the bug.
As a consultant, I used to be asked to test an idea or propose a new one to solve a problem. In the last couple of years I've often been asked to work quite differently. You might find me and the other consultants filling out survey forms or tweaking edits to a collective essay. I'm saying and doing much less than I used to, even though I'm still being paid the same amount. Maybe I shouldn't complain, but the actions of big institutions do matter, and it's time to speak out against the collectivity fad that is upon us.
It's not hard to see why the fallacy of collectivism has become so popular in big organizations: If the principle is correct, then individuals should not be required to take on risks or responsibilities. We live in times of tremendous uncertainties coupled with infinite liability phobia, and we must function within institutions that are loyal to no executive, much less to any lower level member. Every individual who is afraid to say the wrong thing within his or her organization is safer when hiding behind a wiki or some other Meta aggregation ritual.
I've participated in a number of elite, well-paid wikis and Meta-surveys lately and have had a chance to observe the results. I have even been part of a wiki about wikis. What I've seen is a loss of insight and subtlety, a disregard for the nuances of considered opinions, and an increased tendency to enshrine the official or normative beliefs of an organization. Why isn't everyone screaming about the recent epidemic of inappropriate uses of the collective? It seems to me the reason is that bad old ideas look confusingly fresh when they are packaged as technology.
The collective rises around us in multifarious ways. What afflicts big institutions also afflicts pop culture. For instance, it has become notoriously difficult to introduce a new pop star in the music business. Even the most successful entrants have hardly ever made it past the first album in the last decade or so. The exception is American Idol. As with the Wikipedia, there's nothing wrong with it. The problem is its centrality.
More people appear to vote in this pop competition than in presidential elections, and one reason for this is the instant convenience of information technology. The collective can vote by phone or by texting, and some vote more than once. The collective is flattered and it responds. The winners are likable, almost by definition.
But John Lennon wouldn't have won. He wouldn't have made it to the finals. Or if he had, he would have ended up a different sort of person and artist. The same could be said about Jimi Hendrix, Elvis, Joni Mitchell, Duke Ellington, David Byrne, Grandmaster Flash, Bob Dylan (please!), and almost anyone else who has been vastly influential in creating pop music.
As below, so above. The New York Times, of all places, has recently published op-ed pieces supporting the pseudo-idea of intelligent design. This is astonishing. The Times has become the paper of averaging opinions. Something is lost when American Idol becomes a leader instead of a follower of pop music. But when intelligent design shares the stage with real science in the paper of record, everything is lost.
How could the Times have fallen so far? I don't know, but I would imagine the process was similar to what I've seen in the consulting world of late. It's safer to be the aggregator of the collective. You get to include all sorts of material without committing to anything. You can be superficially interesting without having to worry about the possibility of being wrong.
Except when intelligent thought really matters. In that case the average idea can be quite wrong, and only the best ideas have lasting value. Science is like that.
The collective isn't always stupid. In some special cases the collective can be brilliant. For instance, there's a demonstrative ritual often presented to incoming students at business schools. In one version of the ritual, a large jar of jellybeans is placed in the front of a classroom. Each student guesses how many beans there are. While the guesses vary widely, the average is usually accurate to an uncanny degree.
This is an example of the special kind of intelligence offered by a collective. It is that peculiar trait that has been celebrated as the "Wisdom of Crowds," though I think the word "wisdom" is misleading. It is part of what makes Adam Smith's Invisible Hand clever, and is connected to the reasons Google's page rank algorithms work. It was long ago adapted to futurism, where it was known as the Delphi technique. The phenomenon is real, and immensely useful.
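The jellybean phenomenon is easy to reproduce in a few lines. The sketch below is purely illustrative (the true count and the error distribution are my assumptions, not data from any actual classroom): each simulated student guesses with a large, independent error, yet the average lands remarkably close to the truth, because the errors tend to cancel.

```python
import random

random.seed(42)  # make the illustration repeatable
TRUE_COUNT = 850  # hypothetical number of jellybeans in the jar

# Each "student" guesses with a wide, independent error:
# anywhere from 40% below to 60% above the true count.
guesses = [TRUE_COUNT * random.uniform(0.4, 1.6) for _ in range(200)]

average = sum(guesses) / len(guesses)
worst = max(abs(g - TRUE_COUNT) for g in guesses)
print(f"true count: {TRUE_COUNT}")
print(f"average guess: {average:.0f}")
print(f"worst individual error: {worst:.0f}")
```

Individual guesses can be off by hundreds of beans, while the average is typically within a few percent of the true count. The trick only works because the errors are independent; if the students conferred first, the cancellation would vanish.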
But it is not infinitely useful. The collective can be stupid, too. Witness tulip crazes and stock bubbles. Hysteria over fictitious satanic cult child abductions. Y2K mania.
The reason the collective can be valuable is precisely that its peaks of intelligence and stupidity are not the same as the ones usually displayed by individuals. Both kinds of intelligence are essential.
What makes a market work, for instance, is the marriage of collective and individual intelligence. A marketplace can't exist only on the basis of having prices determined by competition. It also needs entrepreneurs to come up with the products that are competing in the first place.
In other words, clever individuals, the heroes of the marketplace, ask the questions which are answered by collective behavior. They put the jellybeans in the jar.
There are certain types of answers that ought not be provided by an individual. When a government bureaucrat sets a price, for instance, the result is often inferior to the answer that would come from a reasonably informed collective that is reasonably free of manipulation or runaway internal resonances. But when a collective designs a product, you get design by committee, which is a derogatory expression for a reason.
Here I must take a moment to comment on Linux and similar efforts. The various formulations of "open" or "free" software are different from the Wikipedia and the race to be most Meta in important ways. Linux programmers are not anonymous and in fact personal glory is part of the motivational engine that keeps such enterprises in motion. But there are similarities, and the lack of a coherent voice or design sensibility in an esthetic sense is one negative quality of both open source software and the Wikipedia.
These movements are at their most efficient while building hidden information plumbing layers, such as Web servers. They are hopeless when it comes to producing fine user interfaces or user experiences. If the code that ran the Wikipedia user interface were as open as the contents of the entries, it would churn itself into impenetrable muck almost immediately. The collective is good at solving problems which demand results that can be evaluated by uncontroversial performance parameters, but it is bad when taste and judgment matter.
Collectives can be just as stupid as any individual, and in important cases, stupider. The interesting question is whether it's possible to map out where the one is smarter than the many.
There is a lot of history to this topic, and varied disciplines have lots to say. Here is a quick pass at where I think the boundary between effective collective thought and nonsense lies: The collective is more likely to be smart when it isn't defining its own questions, when the goodness of an answer can be evaluated by a simple result (such as a single numeric value), and when the information system which informs the collective is filtered by a quality control mechanism that relies on individuals to a high degree. Under those circumstances, a collective can be smarter than a person. Break any one of those conditions and the collective becomes unreliable or worse.
Meanwhile, an individual best achieves optimal stupidity on those rare occasions when one is both given substantial powers and insulated from the results of his or her actions.
If the above criteria have any merit, then there is an unfortunate convergence. The setup for the most stupid collective is also the setup for the most stupid individuals.
Every authentic example of collective intelligence that I am aware of also shows how that collective was guided or inspired by well-meaning individuals. These people focused the collective and in some cases also corrected for some of the common hive mind failure modes. The balancing of influence between people and collectives is the heart of the design of democracies, scientific communities, and many other long-standing projects. There's a lot of experience out there to work with. A few of these old ideas provide interesting new ways to approach the question of how to best use the hive mind.
The pre-Internet world provides some great examples of how personality-based quality control can improve collective intelligence. For instance, an independent press provides tasty news about politicians by reporters with strong voices and reputations, like the Watergate reporting of Woodward and Bernstein. Other writers provide product reviews, such as Walt Mossberg in The Wall Street Journal and David Pogue in The New York Times. Such journalists inform the collective's determination of election results and pricing. Without an independent press, composed of heroic voices, the collective becomes stupid and unreliable, as has been demonstrated in many historical instances. (Recent events in America have reflected the weakening of the press, in my opinion.)
Scientific communities likewise achieve quality through a cooperative process that includes checks and balances, and ultimately rests on a foundation of goodwill and "blind" elitism — blind in the sense that ideally anyone can gain entry, but only on the basis of a meritocracy. The tenure system and many other aspects of the academy are designed to support the idea that individual scholars matter, not just the process or the collective.
Another example: Entrepreneurs aren't the only "heroes" of a marketplace. The role of a central bank in an economy is not the same as that of a communist party official in a centrally planned economy. Even though setting an interest rate sounds like the answering of a question, it is really more like the asking of a question. The Fed asks the market to answer the question of how to best optimize for lowering inflation, for instance. While that might not be the question everyone would want to have asked, it is at least coherent.
Yes, there have been plenty of scandals in government, the academy and in the press. No mechanism is perfect, but still here we are, having benefited from all of these institutions. There certainly have been plenty of bad reporters, self-deluded academic scientists, incompetent bureaucrats, and so on. Can the hive mind help keep them in check? The answer provided by experiments in the pre-Internet world is "yes," but only provided some signal processing is placed in the loop.
Some of the regulating mechanisms for collectives that have been most successful in the pre-Internet world can be understood in part as modulating the time domain. For instance, what if a collective moves too readily and quickly, jittering instead of settling down to provide a single answer? This happens on the most active Wikipedia entries, for example, and has also been seen in some speculation frenzies in open markets.
One service performed by representative democracy is low-pass filtering. Imagine the jittery shifts that would take place if a wiki were put in charge of writing laws. It's a terrifying thing to consider. Super-energized people would be struggling to shift the wording of the tax-code on a frantic, never-ending basis. The Internet would be swamped.
Such chaos can be avoided in the same way it already is, albeit imperfectly, by the slower processes of elections and court proceedings. The calming effect of orderly democracy achieves more than just the smoothing out of peripatetic struggles for consensus. It also reduces the potential for the collective to suddenly jump into an over-excited state when too many rapid changes to answers coincide in such a way that they don't cancel each other out. (Technical readers will recognize familiar principles in signal processing.)
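For technical readers, the filtering analogy above can be made concrete. The sketch below (my own illustration, not a model of any real institution) applies a simple exponential moving average, a basic low-pass filter, to a signal that flips wildly between two positions, the way a contested wiki entry flips between two factions' wordings. The slow process damps the jitter into something stable.

```python
def low_pass(samples, alpha=0.1):
    """Exponential moving average: each output moves only a fraction
    `alpha` of the way toward the newest sample, so rapid back-and-forth
    swings are damped while a sustained shift still gets through."""
    value = samples[0]
    smoothed = []
    for s in samples:
        value = value + alpha * (s - value)
        smoothed.append(value)
    return smoothed

# A signal that flips between two extremes, like an edit war.
jittery = [0, 10, 0, 10, 0, 10, 0, 10]
calmed = low_pass(jittery)
print([round(v, 2) for v in calmed])
```

The raw signal swings the full distance between 0 and 10 at every step; the filtered output barely moves, which is the analogue of elections and court proceedings smoothing out a frantic struggle for consensus.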
The Wikipedia has recently slapped a crude low-pass filter on the jitteriest entries, such as "President George W. Bush." There's now a limit to how often a particular person can remove someone else's text fragments. I suspect that this will eventually have to evolve into an approximate mirror of democracy as it was before the Internet arrived.
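A mechanism of that kind can be sketched in a few lines. The class below is a hypothetical illustration of a per-editor rate limit (the names, window, and threshold are all my inventions, not Wikipedia's actual policy): each editor is allowed only a fixed number of reverts within a rolling time window.

```python
class EditRateLimiter:
    """Hypothetical sketch: allow each editor at most `max_edits`
    reverts within any rolling window of `window` seconds."""

    def __init__(self, max_edits=3, window=86400):
        self.max_edits = max_edits
        self.window = window
        self.history = {}  # editor name -> timestamps of recent edits

    def allow(self, editor, now):
        # Keep only timestamps still inside the rolling window.
        recent = [t for t in self.history.get(editor, []) if now - t < self.window]
        if len(recent) >= self.max_edits:
            self.history[editor] = recent
            return False  # over the limit: edit rejected
        recent.append(now)
        self.history[editor] = recent
        return True


limiter = EditRateLimiter(max_edits=2, window=60)
print(limiter.allow("alice", 0))    # first revert: allowed
print(limiter.allow("alice", 10))   # second revert: allowed
print(limiter.allow("alice", 20))   # third within the window: blocked
print(limiter.allow("alice", 120))  # window has passed: allowed again
```

In signal-processing terms this is the same move as the low-pass filter: it caps how fast any one participant can drive the system, without forbidding change outright.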
The reverse problem can also appear. The hive mind can be on the right track, but moving too slowly. Sometimes collectives would yield brilliant results given enough time but there isn't enough time. A problem like global warming would automatically be addressed eventually if the market had enough time to respond to it, for instance. Insurance rates would climb, and so on. Alas, in this case there isn't enough time, because the market conversation is slowed down by the legacy effect of existing investments. Therefore some other process has to intervene, such as politics invoked by individuals.
Another example of the slow hive problem: There was a lot of technology developed slowly in the millennia before there was a clear idea of how to be empirical, how to have a peer reviewed technical literature and an education based on it, and before there was an efficient market to determine the value of inventions. What is crucial to notice about modernity is that structure and constraints were part of what sped up the process of technological development, not just pure openness and concessions to the collective.
Let's suppose that the Wikipedia will indeed become better in some ways, as is claimed by the faithful, over a period of time. We might still need something better sooner.
Some wikitopians explicitly hope to see education subsumed by wikis. It is at least possible that in the fairly near future enough communication and education will take place through anonymous Internet aggregation that we could become vulnerable to a sudden dangerous empowering of the hive mind. History has shown us again and again that a hive mind is a cruel idiot when it runs on autopilot. Nasty hive mind outbursts have been flavored Maoist, Fascist, and religious, and these are only a small sampling. I don't see why there couldn't be future social disasters that appear suddenly under the cover of technological utopianism. If wikis are to gain any more influence they ought to be improved by mechanisms like the ones that have worked tolerably well in the pre-Internet world.
The hive mind should be thought of as a tool. Empowering the collective does not empower individuals — just the reverse is true. There can be useful feedback loops set up between individuals and the hive mind, but the hive mind is too chaotic to be fed back into itself.
These are just a few ideas about how to train a potentially dangerous collective and not let it get out of the yard. When there's a problem, you want it to bark but not bite you.
The illusion that what we already have is close to good enough, or that it is alive and will fix itself, is the most dangerous illusion of all. By avoiding that nonsense, it ought to be possible to find a humanistic and practical way to maximize the value of the collective on the Web without turning ourselves into idiots. The best guiding principle is to always cherish individuals first.